Test Report: Docker_Linux_containerd_arm64 21808

db33af8e7a29a5e500790b374373258f8b494afd:2025-12-17:42825

Failed tests (25/369)

Order  Failed test  Duration (s)
171 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy 502.31
173 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart 368.2
175 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods 2.33
185 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd 2.34
186 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly 2.41
187 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig 736.12
188 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth 2.16
191 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService 0.06
194 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd 1.92
197 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd 3.1
201 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect 2.51
203 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim 241.63
213 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels 3.13
235 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp 0.07
236 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List 0.36
237 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput 0.32
239 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS 0.35
240 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format 0.34
241 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL 0.34
243 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel 0.49
246 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup 0.15
247 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect 129.03
255 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port 2.36
358 TestKubernetesUpgrade 794.69
498 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 7200.07
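
A minimal local-reproduction sketch for any single failure above, assuming a minikube source checkout; the Makefile target and the TEST_ARGS forwarding are assumptions taken from the upstream contributor docs and may differ by branch:

	# build the arm64 binary this job invokes (assumed Makefile target; the
	# output path matches the one used throughout this report)
	make out/minikube-linux-arm64
	# re-run one failing test group via the integration harness (TEST_ARGS
	# handling is an assumption, not confirmed by this report)
	env TEST_ARGS="-minikube-start-args=--driver=docker --test.run TestFunctionalNewestKubernetes" \
		make integration
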
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (502.31s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
E1217 20:17:32.880389  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:19:49.015139  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:20:16.729066  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.508393  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.514804  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.526236  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.547710  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.589192  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.670741  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:28.832348  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:29.154230  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:29.796354  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:31.077916  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:33.639408  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:38.760771  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:21:49.002062  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:22:09.484323  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:22:50.445847  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:24:12.367431  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:24:49.015195  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 109 (8m20.8787724s)

-- stdout --
	* [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	* Pulling base image v0.0.48-1765661130-22141 ...
	* Found network options:
	  - HTTP_PROXY=localhost:34181
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	
	

-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:34181 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000237436s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000252088s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000252088s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
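
The wait-control-plane failure above reduces to a kubelet that never answered its health endpoint. A hedged triage sketch using only commands the log itself suggests; the container name, node IP, and binary path come from this report, and curl being present inside the kicbase image is an assumption:

	# inspect the kubelet inside the node container, per the kubeadm hint
	docker exec functional-682596 systemctl status kubelet
	docker exec functional-682596 journalctl -xeu kubelet
	# probe the health endpoint kubeadm was polling
	docker exec functional-682596 curl -sS http://127.0.0.1:10248/healthz
	# clear the NO_PROXY warning, then retry with minikube's own suggestion
	export NO_PROXY=192.168.49.2
	out/minikube-linux-arm64 start -p functional-682596 \
		--extra-config=kubelet.cgroup-driver=systemd
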
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1": exit status 109
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
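
The fields that matter here can be pulled from the blob above with docker inspect Go templates; the range form sidesteps indexing the hyphenated network name:

	# published host ports (compare with the Ports map above)
	docker inspect --format '{{json .NetworkSettings.Ports}}' functional-682596
	# node IP that NO_PROXY needs to cover
	docker inspect --format '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' functional-682596
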
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 6 (308.016387ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1217 20:25:42.878543  414001 status.go:458] kubeconfig endpoint: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig

** /stderr **
helpers_test.go:248: status error: exit status 6 (may be ok)
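
The exit-6 status traces to the kubeconfig endpoint error above. A sketch of the check-and-repair the warning itself recommends (the kubeconfig path is copied from the log):

	# confirm the profile really is missing from the job's kubeconfig
	grep -n 'functional-682596' /home/jenkins/minikube-integration/21808-367595/kubeconfig
	# minikube's own suggested fix for the stale context
	out/minikube-linux-arm64 -p functional-682596 update-context
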
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /usr/share/ca-certificates/3694612.pem                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/test/nested/copy/369461/hosts                                                                                               │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save kicbase/echo-server:functional-032730 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image rm kicbase/echo-server:functional-032730 --alsologtostderr                                                                              │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format yaml --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format short --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format json --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format table --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh pgrep buildkitd                                                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image          │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                          │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete         │ -p functional-032730                                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start          │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:17:21
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:17:21.726426  408464 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:17:21.726544  408464 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:17:21.726548  408464 out.go:374] Setting ErrFile to fd 2...
	I1217 20:17:21.726552  408464 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:17:21.726806  408464 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:17:21.727217  408464 out.go:368] Setting JSON to false
	I1217 20:17:21.728018  408464 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10787,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:17:21.728075  408464 start.go:143] virtualization:  
	I1217 20:17:21.732855  408464 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:17:21.736651  408464 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:17:21.736757  408464 notify.go:221] Checking for updates...
	I1217 20:17:21.744130  408464 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:17:21.747648  408464 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:17:21.751390  408464 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:17:21.754752  408464 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:17:21.757946  408464 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:17:21.761443  408464 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:17:21.789893  408464 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:17:21.790020  408464 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:17:21.849589  408464 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-17 20:17:21.840363338 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:17:21.849692  408464 docker.go:319] overlay module found
	I1217 20:17:21.853030  408464 out.go:179] * Using the docker driver based on user configuration
	I1217 20:17:21.856115  408464 start.go:309] selected driver: docker
	I1217 20:17:21.856125  408464 start.go:927] validating driver "docker" against <nil>
	I1217 20:17:21.856137  408464 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:17:21.856902  408464 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:17:21.910849  408464 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-17 20:17:21.90199796 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:17:21.911005  408464 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1217 20:17:21.911226  408464 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:17:21.914292  408464 out.go:179] * Using Docker driver with root privileges
	I1217 20:17:21.917410  408464 cni.go:84] Creating CNI manager for ""
	I1217 20:17:21.917472  408464 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:17:21.917479  408464 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1217 20:17:21.917557  408464 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:17:21.920752  408464 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:17:21.923735  408464 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:17:21.926625  408464 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:17:21.929425  408464 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:17:21.929482  408464 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:17:21.929489  408464 cache.go:65] Caching tarball of preloaded images
	I1217 20:17:21.929501  408464 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:17:21.929577  408464 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:17:21.929593  408464 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:17:21.929917  408464 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:17:21.929939  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json: {Name:mk7e667c03cec200e74dbcb9647a4a92f028de4c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:21.948528  408464 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:17:21.948538  408464 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:17:21.948556  408464 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:17:21.948585  408464 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:17:21.948695  408464 start.go:364] duration metric: took 95.451µs to acquireMachinesLock for "functional-682596"
	I1217 20:17:21.948720  408464 start.go:93] Provisioning new machine with config: &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:17:21.948784  408464 start.go:125] createHost starting for "" (driver="docker")
	I1217 20:17:21.952131  408464 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1217 20:17:21.952423  408464 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:34181 to docker env.
	I1217 20:17:21.952448  408464 start.go:159] libmachine.API.Create for "functional-682596" (driver="docker")
	I1217 20:17:21.952469  408464 client.go:173] LocalClient.Create starting
	I1217 20:17:21.952533  408464 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem
	I1217 20:17:21.952563  408464 main.go:143] libmachine: Decoding PEM data...
	I1217 20:17:21.952577  408464 main.go:143] libmachine: Parsing certificate...
	I1217 20:17:21.952631  408464 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem
	I1217 20:17:21.952646  408464 main.go:143] libmachine: Decoding PEM data...
	I1217 20:17:21.952659  408464 main.go:143] libmachine: Parsing certificate...
	I1217 20:17:21.953002  408464 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1217 20:17:21.968635  408464 cli_runner.go:211] docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1217 20:17:21.968715  408464 network_create.go:284] running [docker network inspect functional-682596] to gather additional debugging logs...
	I1217 20:17:21.968730  408464 cli_runner.go:164] Run: docker network inspect functional-682596
	W1217 20:17:21.985090  408464 cli_runner.go:211] docker network inspect functional-682596 returned with exit code 1
	I1217 20:17:21.985110  408464 network_create.go:287] error running [docker network inspect functional-682596]: docker network inspect functional-682596: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-682596 not found
	I1217 20:17:21.985123  408464 network_create.go:289] output of [docker network inspect functional-682596]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-682596 not found
	
	** /stderr **
	I1217 20:17:21.985223  408464 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:17:22.001590  408464 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x400194c640}
	I1217 20:17:22.001631  408464 network_create.go:124] attempt to create docker network functional-682596 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1217 20:17:22.001700  408464 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-682596 functional-682596
	I1217 20:17:22.059151  408464 network_create.go:108] docker network functional-682596 192.168.49.0/24 created
	I1217 20:17:22.059174  408464 kic.go:121] calculated static IP "192.168.49.2" for the "functional-682596" container
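The subnet probe at 20:17:22.001590 walks the private ranges, settles on the first free one (192.168.49.0/24), and derives the node's static IP from its first client address (.2). A minimal sketch for verifying the created network by hand, using the same Go-template style of inspection seen in the log:

    # print subnet and gateway of the freshly created bridge network
    docker network inspect functional-682596 \
      --format '{{(index .IPAM.Config 0).Subnet}} {{(index .IPAM.Config 0).Gateway}}'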
	I1217 20:17:22.059249  408464 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1217 20:17:22.076475  408464 cli_runner.go:164] Run: docker volume create functional-682596 --label name.minikube.sigs.k8s.io=functional-682596 --label created_by.minikube.sigs.k8s.io=true
	I1217 20:17:22.096119  408464 oci.go:103] Successfully created a docker volume functional-682596
	I1217 20:17:22.096207  408464 cli_runner.go:164] Run: docker run --rm --name functional-682596-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-682596 --entrypoint /usr/bin/test -v functional-682596:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -d /var/lib
	I1217 20:17:22.638069  408464 oci.go:107] Successfully prepared a docker volume functional-682596
	I1217 20:17:22.638130  408464 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:17:22.638148  408464 kic.go:194] Starting extracting preloaded images to volume ...
	I1217 20:17:22.638211  408464 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v functional-682596:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -I lz4 -xf /preloaded.tar -C /extractDir
	I1217 20:17:26.699436  408464 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v functional-682596:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -I lz4 -xf /preloaded.tar -C /extractDir: (4.061190772s)
	I1217 20:17:26.699458  408464 kic.go:203] duration metric: took 4.061316877s to extract preloaded images to volume ...
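The extraction step is a generic Docker pattern: bind-mount a compressed tarball read-only into a throwaway container and untar it into a named volume, so the data outlives the container. A hedged sketch with placeholder names (mycache.tar.lz4, mydata, and $IMAGE are illustrative; the image must ship tar and lz4, as the kicbase image here does):

    docker volume create mydata
    docker run --rm \
      -v "$PWD/mycache.tar.lz4":/preloaded.tar:ro \
      -v mydata:/extractDir \
      --entrypoint /usr/bin/tar \
      "$IMAGE" -I lz4 -xf /preloaded.tar -C /extractDir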
	W1217 20:17:26.699606  408464 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1217 20:17:26.699743  408464 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1217 20:17:26.760160  408464 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-682596 --name functional-682596 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-682596 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-682596 --network functional-682596 --ip 192.168.49.2 --volume functional-682596:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78
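Note the --publish=127.0.0.1::<port> form in the docker run above: the empty host-port field asks Docker for an ephemeral loopback port, which minikube reads back later via container inspect. A small sketch for recovering such a mapping (container name from the log):

    # which host port did Docker pick for the container's 22/tcp?
    docker port functional-682596 22/tcp
    # the same via a Go template, as minikube does further down
    docker container inspect functional-682596 \
      --format '{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'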
	I1217 20:17:27.084701  408464 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Running}}
	I1217 20:17:27.110049  408464 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:17:27.134409  408464 cli_runner.go:164] Run: docker exec functional-682596 stat /var/lib/dpkg/alternatives/iptables
	I1217 20:17:27.181682  408464 oci.go:144] the created container "functional-682596" has a running status.
	I1217 20:17:27.181701  408464 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa...
	I1217 20:17:27.464023  408464 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1217 20:17:27.491072  408464 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:17:27.515341  408464 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1217 20:17:27.515353  408464 kic_runner.go:114] Args: [docker exec --privileged functional-682596 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1217 20:17:27.581169  408464 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:17:27.607895  408464 machine.go:94] provisionDockerMachine start ...
	I1217 20:17:27.607991  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:27.628691  408464 main.go:143] libmachine: Using SSH client type: native
	I1217 20:17:27.629019  408464 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:17:27.629025  408464 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:17:27.633580  408464 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1217 20:17:30.763773  408464 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:17:30.763787  408464 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:17:30.763850  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:30.781985  408464 main.go:143] libmachine: Using SSH client type: native
	I1217 20:17:30.782415  408464 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:17:30.782428  408464 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:17:30.925683  408464 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:17:30.925754  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:30.943805  408464 main.go:143] libmachine: Using SSH client type: native
	I1217 20:17:30.944107  408464 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:17:30.944123  408464 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:17:31.077007  408464 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:17:31.077026  408464 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:17:31.077044  408464 ubuntu.go:190] setting up certificates
	I1217 20:17:31.077052  408464 provision.go:84] configureAuth start
	I1217 20:17:31.077131  408464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:17:31.096361  408464 provision.go:143] copyHostCerts
	I1217 20:17:31.096424  408464 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:17:31.096432  408464 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:17:31.096514  408464 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:17:31.096614  408464 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:17:31.096618  408464 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:17:31.096645  408464 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:17:31.096705  408464 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:17:31.096709  408464 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:17:31.096738  408464 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:17:31.096808  408464 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:17:31.215483  408464 provision.go:177] copyRemoteCerts
	I1217 20:17:31.215557  408464 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:17:31.215596  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:31.232538  408464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:17:31.328090  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:17:31.345537  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:17:31.362711  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:17:31.379536  408464 provision.go:87] duration metric: took 302.460445ms to configureAuth
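configureAuth regenerates server.pem with the SAN list shown at 20:17:31.096808 (127.0.0.1, 192.168.49.2, functional-682596, localhost, minikube), so TLS verification succeeds whichever of those addresses a client dials. A self-signed openssl sketch of a certificate carrying the same SANs (requires OpenSSL 1.1.1+ for -addext; this illustrates only the SAN handling, not minikube's CA-signed flow):

    openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
      -keyout server-key.pem -out server.pem \
      -subj "/O=jenkins.functional-682596/CN=minikube" \
      -addext "subjectAltName=DNS:localhost,DNS:minikube,DNS:functional-682596,IP:127.0.0.1,IP:192.168.49.2"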
	I1217 20:17:31.379554  408464 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:17:31.379749  408464 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:17:31.379755  408464 machine.go:97] duration metric: took 3.771849448s to provisionDockerMachine
	I1217 20:17:31.379760  408464 client.go:176] duration metric: took 9.427287397s to LocalClient.Create
	I1217 20:17:31.379784  408464 start.go:167] duration metric: took 9.427335906s to libmachine.API.Create "functional-682596"
	I1217 20:17:31.379790  408464 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:17:31.379800  408464 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:17:31.379847  408464 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:17:31.379881  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:31.396723  408464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:17:31.496364  408464 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:17:31.499676  408464 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:17:31.499694  408464 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:17:31.499704  408464 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:17:31.499761  408464 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:17:31.499846  408464 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:17:31.499926  408464 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:17:31.499972  408464 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:17:31.507593  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:17:31.525966  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:17:31.545730  408464 start.go:296] duration metric: took 165.925388ms for postStartSetup
	I1217 20:17:31.546127  408464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:17:31.567546  408464 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:17:31.567837  408464 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:17:31.567879  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:31.584998  408464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:17:31.677573  408464 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
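The two df probes read the state of /var inside the container: NR==2 skips the header row, $5 of `df -h` is the Use% column, and $4 of `df -BG` is the available space in whole gigabytes. Standalone, with illustrative output:

    df -h /var | awk 'NR==2{print $5}'    # e.g. 12%
    df -BG /var | awk 'NR==2{print $4}'   # e.g. 180G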
	I1217 20:17:31.682686  408464 start.go:128] duration metric: took 9.733887778s to createHost
	I1217 20:17:31.682701  408464 start.go:83] releasing machines lock for "functional-682596", held for 9.733999296s
	I1217 20:17:31.682772  408464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:17:31.704525  408464 out.go:179] * Found network options:
	I1217 20:17:31.707558  408464 out.go:179]   - HTTP_PROXY=localhost:34181
	W1217 20:17:31.710383  408464 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1217 20:17:31.713311  408464 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1217 20:17:31.716190  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:17:31.716239  408464 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:17:31.716314  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:17:31.716354  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:17:31.716380  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:17:31.716407  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:17:31.716451  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:17:31.716520  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:17:31.716579  408464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:17:31.733697  408464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:17:31.842902  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:17:31.861156  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:17:31.881574  408464 ssh_runner.go:195] Run: openssl version
	I1217 20:17:31.888051  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:17:31.896077  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:17:31.903494  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:17:31.907204  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:17:31.907271  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:17:31.948081  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:17:31.955692  408464 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/369461.pem /etc/ssl/certs/51391683.0
	I1217 20:17:31.962750  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:17:31.970050  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:17:31.977561  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:17:31.981522  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:17:31.981575  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:17:32.023052  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:17:32.031209  408464 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/3694612.pem /etc/ssl/certs/3ec20f2e.0
	I1217 20:17:32.039068  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:32.047122  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:17:32.055001  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:32.059062  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:32.059119  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:32.104846  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:17:32.112502  408464 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1217 20:17:32.121041  408464 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:17:32.124629  408464 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
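The symlink names above (51391683.0, 3ec20f2e.0, b5213941.0) follow the OpenSSL c_rehash convention: /etc/ssl/certs/<subject-hash>.0 points at the certificate so verifiers can locate a CA by its hashed subject. A minimal sketch of producing one such link by hand:

    # compute the subject hash OpenSSL uses for /etc/ssl/certs lookups
    h=$(openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem)
    sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem "/etc/ssl/certs/${h}.0"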
	I1217 20:17:32.127988  408464 ssh_runner.go:195] Run: cat /version.json
	I1217 20:17:32.128068  408464 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:17:32.219904  408464 ssh_runner.go:195] Run: systemctl --version
	I1217 20:17:32.226268  408464 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:17:32.230643  408464 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:17:32.230716  408464 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:17:32.257253  408464 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1217 20:17:32.257278  408464 start.go:496] detecting cgroup driver to use...
	I1217 20:17:32.257309  408464 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:17:32.257369  408464 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:17:32.272465  408464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:17:32.285266  408464 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:17:32.285319  408464 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:17:32.302464  408464 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:17:32.320864  408464 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:17:32.427231  408464 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:17:32.549086  408464 docker.go:234] disabling docker service ...
	I1217 20:17:32.549147  408464 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:17:32.570903  408464 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:17:32.584566  408464 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:17:32.695913  408464 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:17:32.811758  408464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:17:32.824697  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:17:32.838329  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:17:32.847167  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:17:32.856134  408464 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:17:32.856193  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:17:32.864946  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:17:32.874040  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:17:32.883288  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:17:32.892334  408464 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:17:32.900384  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:17:32.909100  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:17:32.917753  408464 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
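The sed passes above toggle a handful of fields in /etc/containerd/config.toml: the pause (sandbox) image, restrict_oom_score_adj, SystemdCgroup=false to match the detected cgroupfs driver, the runc v2 runtime, the CNI conf_dir, and enable_unprivileged_ports. A quick way to confirm the result before the restart, as a sketch:

    # confirm what the edits produced before restarting containerd
    grep -nE 'SystemdCgroup|sandbox_image|enable_unprivileged_ports|conf_dir' \
      /etc/containerd/config.toml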
	I1217 20:17:32.926759  408464 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:17:32.934238  408464 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:17:32.941415  408464 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:17:33.051481  408464 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 20:17:33.185216  408464 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:17:33.185277  408464 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:17:33.190164  408464 start.go:564] Will wait 60s for crictl version
	I1217 20:17:33.190229  408464 ssh_runner.go:195] Run: which crictl
	I1217 20:17:33.193950  408464 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:17:33.217208  408464 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
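	The bare crictl calls here work without endpoint flags because the /etc/crictl.yaml written at 20:17:32.824697 pins the runtime endpoint to the containerd socket. The per-invocation equivalent, as a sketch:

    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock ps -a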
	I1217 20:17:33.217268  408464 ssh_runner.go:195] Run: containerd --version
	I1217 20:17:33.239042  408464 ssh_runner.go:195] Run: containerd --version
	I1217 20:17:33.262252  408464 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:17:33.265249  408464 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:17:33.284021  408464 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:17:33.287934  408464 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1217 20:17:33.297762  408464 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:17:33.297893  408464 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:17:33.297965  408464 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:17:33.321874  408464 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:17:33.321886  408464 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:17:33.321950  408464 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:17:33.348051  408464 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:17:33.348062  408464 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:17:33.348068  408464 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:17:33.348178  408464 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1217 20:17:33.348244  408464 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:17:33.373604  408464 cni.go:84] Creating CNI manager for ""
	I1217 20:17:33.373612  408464 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:17:33.373619  408464 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:17:33.373639  408464 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:17:33.373754  408464 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1217 20:17:33.373819  408464 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:17:33.381695  408464 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:17:33.381758  408464 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:17:33.389618  408464 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:17:33.402477  408464 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:17:33.415211  408464 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
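With the rendered config now staged at /var/tmp/minikube/kubeadm.yaml.new, it can be sanity-checked offline before kubeadm consumes it. A sketch (`kubeadm config validate` needs kubeadm v1.26+):

    kubeadm config validate --config /var/tmp/minikube/kubeadm.yaml.new
    # or exercise the whole init path without mutating the node
    kubeadm init --config /var/tmp/minikube/kubeadm.yaml.new --dry-run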
	I1217 20:17:33.428039  408464 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:17:33.432152  408464 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1217 20:17:33.442688  408464 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:17:33.551422  408464 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:17:33.569132  408464 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:17:33.569142  408464 certs.go:195] generating shared ca certs ...
	I1217 20:17:33.569158  408464 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:33.569295  408464 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:17:33.569343  408464 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:17:33.569348  408464 certs.go:257] generating profile certs ...
	I1217 20:17:33.569403  408464 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:17:33.569412  408464 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt with IP's: []
	I1217 20:17:34.102238  408464 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt ...
	I1217 20:17:34.102254  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: {Name:mk10b4ae3de6bc0fd053aefbfebc11b5e94ecf32 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.102460  408464 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key ...
	I1217 20:17:34.102466  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key: {Name:mkfbebb82964dd97030ecf6f640f403a9688684b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.102554  408464 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:17:34.102567  408464 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt.0c30bf8d with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1217 20:17:34.514245  408464 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt.0c30bf8d ...
	I1217 20:17:34.514261  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt.0c30bf8d: {Name:mk8c85d7b466da52cc57fbefe18f1b22b0f2142b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.514459  408464 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d ...
	I1217 20:17:34.514468  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d: {Name:mk3a794df56061feee53f25aee3143027fa0e637 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.514549  408464 certs.go:382] copying /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt.0c30bf8d -> /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt
	I1217 20:17:34.514622  408464 certs.go:386] copying /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d -> /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key
	I1217 20:17:34.514673  408464 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:17:34.514686  408464 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt with IP's: []
	I1217 20:17:34.732693  408464 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt ...
	I1217 20:17:34.732708  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt: {Name:mk4c2ecdd670098ee36bc4877b5429132f71772f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.732897  408464 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key ...
	I1217 20:17:34.732905  408464 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key: {Name:mkcaa0052c55452343df34bcc47abf04c9c129bf Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:17:34.733090  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:17:34.733167  408464 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:17:34.733174  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:17:34.733201  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:17:34.733224  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:17:34.733248  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:17:34.733293  408464 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:17:34.733899  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:17:34.752811  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:17:34.770536  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:17:34.788392  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:17:34.806750  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:17:34.824539  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:17:34.842740  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:17:34.860711  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:17:34.878384  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:17:34.895186  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:17:34.912484  408464 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:17:34.930719  408464 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:17:34.943909  408464 ssh_runner.go:195] Run: openssl version
	I1217 20:17:34.950422  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:17:34.957881  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:17:34.965315  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:17:34.969148  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:17:34.969206  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:17:35.011866  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:17:35.020380  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:35.028412  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:17:35.036483  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:35.040994  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:35.041050  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:17:35.082880  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:17:35.090758  408464 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:17:35.098511  408464 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:17:35.106244  408464 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:17:35.110074  408464 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:17:35.110139  408464 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:17:35.151546  408464 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:17:35.159189  408464 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:17:35.162712  408464 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1217 20:17:35.162768  408464 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:17:35.162847  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:17:35.162908  408464 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:17:35.194033  408464 cri.go:89] found id: ""
	I1217 20:17:35.194106  408464 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:17:35.201873  408464 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:17:35.209373  408464 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:17:35.209426  408464 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:17:35.217028  408464 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:17:35.217037  408464 kubeadm.go:158] found existing configuration files:
	
	I1217 20:17:35.217093  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:17:35.224840  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:17:35.224906  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:17:35.233070  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:17:35.241440  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:17:35.241495  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:17:35.248882  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:17:35.257370  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:17:35.257477  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:17:35.265788  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:17:35.274397  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:17:35.274453  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:17:35.282878  408464 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:17:35.323242  408464 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:17:35.323292  408464 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:17:35.394184  408464 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:17:35.394262  408464 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:17:35.394308  408464 kubeadm.go:319] OS: Linux
	I1217 20:17:35.394353  408464 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:17:35.394402  408464 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:17:35.394447  408464 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:17:35.394504  408464 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:17:35.394574  408464 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:17:35.394633  408464 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:17:35.394683  408464 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:17:35.394733  408464 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:17:35.394808  408464 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:17:35.465742  408464 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:17:35.465845  408464 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:17:35.465939  408464 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:17:35.474171  408464 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:17:35.480628  408464 out.go:252]   - Generating certificates and keys ...
	I1217 20:17:35.480717  408464 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:17:35.480780  408464 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:17:35.734680  408464 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1217 20:17:36.263127  408464 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1217 20:17:36.626466  408464 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1217 20:17:37.356805  408464 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1217 20:17:37.545588  408464 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1217 20:17:37.545927  408464 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1217 20:17:37.745652  408464 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1217 20:17:37.745806  408464 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1217 20:17:37.888743  408464 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1217 20:17:37.931874  408464 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1217 20:17:38.027072  408464 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1217 20:17:38.027399  408464 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:17:38.192366  408464 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:17:38.903163  408464 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:17:39.404352  408464 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:17:39.670662  408464 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:17:40.147373  408464 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:17:40.148028  408464 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:17:40.151613  408464 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:17:40.155025  408464 out.go:252]   - Booting up control plane ...
	I1217 20:17:40.155150  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:17:40.155238  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:17:40.156028  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:17:40.186779  408464 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:17:40.186880  408464 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:17:40.195235  408464 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:17:40.195328  408464 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:17:40.195371  408464 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:17:40.340340  408464 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:17:40.340452  408464 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:21:40.340538  408464 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000237436s
	I1217 20:21:40.340558  408464 kubeadm.go:319] 
	I1217 20:21:40.340614  408464 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:21:40.340646  408464 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:21:40.340750  408464 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:21:40.340753  408464 kubeadm.go:319] 
	I1217 20:21:40.340856  408464 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:21:40.340887  408464 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:21:40.340917  408464 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:21:40.340920  408464 kubeadm.go:319] 
	I1217 20:21:40.346040  408464 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:21:40.347665  408464 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:21:40.347809  408464 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:21:40.348050  408464 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:21:40.348054  408464 kubeadm.go:319] 
	I1217 20:21:40.348161  408464 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1217 20:21:40.348291  408464 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-682596 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000237436s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
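The failure mode here is that kubeadm's wait-control-plane phase polls the kubelet's local health endpoint for 4m0s and never gets an answer. The checks the error message itself recommends can be run by hand inside the node, assuming a shell there (for example via `minikube ssh`; the profile name and endpoint are taken from this log):

	# The probe kubeadm polled before giving up:
	curl -sSf http://127.0.0.1:10248/healthz && echo ok
	# Is the kubelet unit running at all?
	systemctl status kubelet --no-pager
	# If it is crash-looping, the unit journal says why:
	journalctl -xeu kubelet -n 100 --no-pager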
	
	I1217 20:21:40.348389  408464 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:21:40.763345  408464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:21:40.776730  408464 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:21:40.776787  408464 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:21:40.784760  408464 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:21:40.784768  408464 kubeadm.go:158] found existing configuration files:
	
	I1217 20:21:40.784817  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:21:40.792613  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:21:40.792673  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:21:40.799938  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:21:40.808051  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:21:40.808107  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:21:40.815869  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:21:40.823548  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:21:40.823616  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:21:40.831013  408464 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:21:40.838927  408464 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:21:40.838983  408464 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:21:40.846562  408464 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:21:40.884726  408464 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:21:40.885008  408464 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:21:40.961521  408464 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:21:40.961585  408464 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:21:40.961620  408464 kubeadm.go:319] OS: Linux
	I1217 20:21:40.961666  408464 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:21:40.961714  408464 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:21:40.961760  408464 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:21:40.961807  408464 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:21:40.961854  408464 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:21:40.961901  408464 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:21:40.961945  408464 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:21:40.961992  408464 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:21:40.962037  408464 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:21:41.033198  408464 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:21:41.033325  408464 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:21:41.033432  408464 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:21:41.040705  408464 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:21:41.046212  408464 out.go:252]   - Generating certificates and keys ...
	I1217 20:21:41.046320  408464 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:21:41.046404  408464 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:21:41.046501  408464 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:21:41.046571  408464 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:21:41.046647  408464 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:21:41.046704  408464 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:21:41.046770  408464 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:21:41.046834  408464 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:21:41.046913  408464 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:21:41.046989  408464 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:21:41.047028  408464 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:21:41.047091  408464 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:21:41.186421  408464 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:21:41.352321  408464 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:21:41.659899  408464 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:21:41.780571  408464 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:21:41.907324  408464 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:21:41.907929  408464 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:21:41.911536  408464 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:21:41.914828  408464 out.go:252]   - Booting up control plane ...
	I1217 20:21:41.914926  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:21:41.915003  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:21:41.915715  408464 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:21:41.936212  408464 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:21:41.936423  408464 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:21:41.945262  408464 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:21:41.945549  408464 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:21:41.945738  408464 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:21:42.092879  408464 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:21:42.093114  408464 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:25:42.088415  408464 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000252088s
	I1217 20:25:42.088438  408464 kubeadm.go:319] 
	I1217 20:25:42.088494  408464 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:25:42.088527  408464 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:25:42.088631  408464 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:25:42.088635  408464 kubeadm.go:319] 
	I1217 20:25:42.088738  408464 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:25:42.088770  408464 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:25:42.088800  408464 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:25:42.088803  408464 kubeadm.go:319] 
	I1217 20:25:42.094846  408464 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:25:42.095369  408464 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:25:42.095514  408464 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:25:42.095757  408464 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:25:42.095762  408464 kubeadm.go:319] 
	I1217 20:25:42.095910  408464 kubeadm.go:403] duration metric: took 8m6.933150928s to StartCluster
	I1217 20:25:42.095948  408464 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 20:25:42.095981  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:25:42.096094  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:25:42.125693  408464 cri.go:89] found id: ""
	I1217 20:25:42.125709  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.125719  408464 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:25:42.125726  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:25:42.125948  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:25:42.158251  408464 cri.go:89] found id: ""
	I1217 20:25:42.158266  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.158274  408464 logs.go:284] No container was found matching "etcd"
	I1217 20:25:42.158281  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:25:42.158354  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:25:42.188180  408464 cri.go:89] found id: ""
	I1217 20:25:42.188196  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.188204  408464 logs.go:284] No container was found matching "coredns"
	I1217 20:25:42.188210  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:25:42.188308  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:25:42.216858  408464 cri.go:89] found id: ""
	I1217 20:25:42.216874  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.216882  408464 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:25:42.216887  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:25:42.216958  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:25:42.253901  408464 cri.go:89] found id: ""
	I1217 20:25:42.253916  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.253924  408464 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:25:42.253930  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:25:42.254000  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:25:42.284974  408464 cri.go:89] found id: ""
	I1217 20:25:42.284988  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.284996  408464 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:25:42.285001  408464 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:25:42.285063  408464 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:25:42.318584  408464 cri.go:89] found id: ""
	I1217 20:25:42.318597  408464 logs.go:282] 0 containers: []
	W1217 20:25:42.318605  408464 logs.go:284] No container was found matching "kindnet"
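Each of the probes above asks crictl for containers whose name matches one control-plane component; an empty ID list means that component never started. The same sweep as a single loop, a sketch reusing the crictl flags shown in the log (the loop itself is illustrative):

	for name in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
	  ids=$(sudo crictl ps -a --quiet --name="$name")
	  echo "$name: ${ids:-<none>}"   # every component comes back empty in this run
	done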
	I1217 20:25:42.318614  408464 logs.go:123] Gathering logs for dmesg ...
	I1217 20:25:42.318651  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:25:42.334655  408464 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:25:42.334672  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:25:42.399188  408464 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:25:42.390062    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.390716    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.392768    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.393411    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.395142    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:25:42.390062    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.390716    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.392768    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.393411    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:42.395142    4838 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:25:42.399200  408464 logs.go:123] Gathering logs for containerd ...
	I1217 20:25:42.399210  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:25:42.438269  408464 logs.go:123] Gathering logs for container status ...
	I1217 20:25:42.438288  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:25:42.470620  408464 logs.go:123] Gathering logs for kubelet ...
	I1217 20:25:42.470637  408464 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1217 20:25:42.527492  408464 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000252088s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:25:42.527537  408464 out.go:285] * 
	W1217 20:25:42.527602  408464 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000252088s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:25:42.527679  408464 out.go:285] * 
	W1217 20:25:42.529802  408464 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:25:42.535585  408464 out.go:203] 
	W1217 20:25:42.539273  408464 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000252088s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:25:42.539324  408464 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:25:42.539346  408464 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:25:42.543078  408464 out.go:203] 
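Given the repeated preflight warning above (kubelet v1.35 or newer refuses to run on a cgroups v1 host unless the KubeletConfiguration option FailCgroupV1 is set to false), the exit suggestion is to retry with a kubelet cgroup-driver override. A sketch of that retry, with the profile and flags reconstructed from the StartCluster config earlier in this log; whether it actually resolves the failure on this host is not verified here:

	minikube start -p functional-682596 --memory=4096 --apiserver-port=8441 \
	  --driver=docker --container-runtime=containerd \
	  --kubernetes-version=v1.35.0-rc.1 \
	  --extra-config=kubelet.cgroup-driver=systemd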
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.126766889Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.126785712Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.126836913Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.126853397Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.126874082Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127012004Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127034593Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127052102Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127074536Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127112444Z" level=info msg="Connect containerd service"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.127526872Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.128317362Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.139998451Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.140234146Z" level=info msg="Start recovering state"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.140044138Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.140549382Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.181957692Z" level=info msg="Start event monitor"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182010460Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182020421Z" level=info msg="Start streaming server"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182034986Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182044807Z" level=info msg="runtime interface starting up..."
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182050887Z" level=info msg="starting plugins..."
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182063228Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:17:33 functional-682596 containerd[807]: time="2025-12-17T20:17:33.182352470Z" level=info msg="containerd successfully booted in 0.080223s"
	Dec 17 20:17:33 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:25:43.509425    4956 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:43.509815    4956 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:43.511450    4956 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:43.511951    4956 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:25:43.513539    4956 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
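kubectl is refused here because the apiserver static pod never started (the kubelet below never comes up); the refusal is a symptom, not a second failure. A quick reachability probe, a sketch assuming the node IP 192.168.49.2 and apiserver port 8441 recorded in this profile's config:

	curl -sk --max-time 5 https://192.168.49.2:8441/healthz; echo
	# 'connection refused' confirms no apiserver is listening at all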
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:25:43 up  3:08,  0 user,  load average: 0.38, 0.54, 1.04
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:25:40 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:25:40 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 318.
	Dec 17 20:25:40 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:40 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:40 functional-682596 kubelet[4763]: E1217 20:25:40.781915    4763 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:25:40 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:25:40 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:25:41 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 17 20:25:41 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:41 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:41 functional-682596 kubelet[4768]: E1217 20:25:41.536868    4768 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:25:41 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:25:41 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:25:42 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 17 20:25:42 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:42 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:42 functional-682596 kubelet[4809]: E1217 20:25:42.311973    4809 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:25:42 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:25:42 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:25:42 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 17 20:25:42 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:42 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:25:43 functional-682596 kubelet[4872]: E1217 20:25:43.058057    4872 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:25:43 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:25:43 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
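The kubelet section of the dump above contains the real root cause: kubelet v1.35.0-rc.1 refuses to start on a cgroup v1 host, systemd restarts it in a tight loop (restart counters 318-321), and kubeadm's healthz wait eventually times out. A one-line host check, a sketch assuming a systemd-managed mount at /sys/fs/cgroup:

	stat -fc %T /sys/fs/cgroup/
	# tmpfs     -> legacy cgroup v1 hierarchy (this Ubuntu 20.04 runner)
	# cgroup2fs -> unified cgroup v2 (what this kubelet requires)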
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 6 (348.063149ms)

-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1217 20:25:43.981842  414214 status.go:458] kubeconfig endpoint: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig

** /stderr **
helpers_test.go:263: status error: exit status 6 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (502.31s)
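The status output above also prints its own remedy for the stale kubectl context. A minimal sketch, assuming the same profile; this repairs only the kubeconfig entry, not the dead apiserver:

	minikube update-context -p functional-682596
	kubectl config current-context   # should now name functional-682596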

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (368.2s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart
I1217 20:25:43.998521  369461 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --alsologtostderr -v=8
E1217 20:26:28.507542  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:26:56.210463  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:29:49.015367  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:31:12.091448  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:31:28.507510  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-682596 --alsologtostderr -v=8: exit status 80 (6m5.26207472s)

-- stdout --
	* [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	* Pulling base image v0.0.48-1765661130-22141 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

-- /stdout --
** stderr ** 
	I1217 20:25:44.045489  414292 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:25:44.045686  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.045714  414292 out.go:374] Setting ErrFile to fd 2...
	I1217 20:25:44.045733  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.046029  414292 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:25:44.046470  414292 out.go:368] Setting JSON to false
	I1217 20:25:44.047409  414292 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11289,"bootTime":1765991855,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:25:44.047515  414292 start.go:143] virtualization:  
	I1217 20:25:44.053027  414292 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:25:44.056011  414292 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:25:44.056093  414292 notify.go:221] Checking for updates...
	I1217 20:25:44.061883  414292 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:25:44.064833  414292 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:44.067589  414292 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:25:44.070446  414292 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:25:44.073380  414292 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:25:44.076968  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:44.077128  414292 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:25:44.112208  414292 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:25:44.112455  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.167112  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.158029599 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.167209  414292 docker.go:319] overlay module found
	I1217 20:25:44.170171  414292 out.go:179] * Using the docker driver based on existing profile
	I1217 20:25:44.173086  414292 start.go:309] selected driver: docker
	I1217 20:25:44.173109  414292 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.173214  414292 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:25:44.173330  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.234258  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.225129855 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.234785  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:44.234848  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:44.234909  414292 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.238034  414292 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:25:44.240853  414292 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:25:44.243760  414292 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:25:44.246713  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:44.246768  414292 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:25:44.246782  414292 cache.go:65] Caching tarball of preloaded images
	I1217 20:25:44.246797  414292 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:25:44.246869  414292 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:25:44.246880  414292 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:25:44.246994  414292 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:25:44.265764  414292 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:25:44.265789  414292 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:25:44.265812  414292 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:25:44.265841  414292 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:25:44.265903  414292 start.go:364] duration metric: took 36.013µs to acquireMachinesLock for "functional-682596"
	I1217 20:25:44.265927  414292 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:25:44.265936  414292 fix.go:54] fixHost starting: 
	I1217 20:25:44.266187  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:44.282574  414292 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:25:44.282603  414292 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:25:44.285918  414292 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:25:44.285950  414292 machine.go:94] provisionDockerMachine start ...
	I1217 20:25:44.286031  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.302759  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.303096  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.303111  414292 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:25:44.431913  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.431939  414292 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:25:44.432002  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.450770  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.451117  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.451136  414292 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:25:44.601580  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.601732  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.619103  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.619412  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.619435  414292 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:25:44.748545  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:25:44.748571  414292 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:25:44.748593  414292 ubuntu.go:190] setting up certificates
	I1217 20:25:44.748603  414292 provision.go:84] configureAuth start
	I1217 20:25:44.748675  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:44.766057  414292 provision.go:143] copyHostCerts
	I1217 20:25:44.766100  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766141  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:25:44.766152  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766226  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:25:44.766327  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766347  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:25:44.766357  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766385  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:25:44.766441  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766461  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:25:44.766471  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766501  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:25:44.766561  414292 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:25:45.107844  414292 provision.go:177] copyRemoteCerts
	I1217 20:25:45.108657  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:25:45.108873  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.149674  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.277212  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1217 20:25:45.277284  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:25:45.298737  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1217 20:25:45.298796  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:25:45.320659  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1217 20:25:45.320720  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:25:45.338755  414292 provision.go:87] duration metric: took 590.128101ms to configureAuth
	I1217 20:25:45.338800  414292 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:25:45.338978  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:45.339040  414292 machine.go:97] duration metric: took 1.053082119s to provisionDockerMachine
	I1217 20:25:45.339048  414292 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:25:45.339059  414292 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:25:45.339122  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:25:45.339165  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.356059  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.452345  414292 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:25:45.455946  414292 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1217 20:25:45.455965  414292 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1217 20:25:45.455970  414292 command_runner.go:130] > VERSION_ID="12"
	I1217 20:25:45.455975  414292 command_runner.go:130] > VERSION="12 (bookworm)"
	I1217 20:25:45.455980  414292 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1217 20:25:45.455983  414292 command_runner.go:130] > ID=debian
	I1217 20:25:45.455989  414292 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1217 20:25:45.455994  414292 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1217 20:25:45.456008  414292 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1217 20:25:45.456046  414292 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:25:45.456062  414292 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:25:45.456073  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:25:45.456130  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:25:45.456208  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:25:45.456215  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /etc/ssl/certs/3694612.pem
	I1217 20:25:45.456308  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:25:45.456313  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> /etc/test/nested/copy/369461/hosts
	I1217 20:25:45.456356  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:25:45.464083  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.481460  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:25:45.500420  414292 start.go:296] duration metric: took 161.357637ms for postStartSetup
	I1217 20:25:45.500542  414292 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:25:45.500615  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.517677  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.609195  414292 command_runner.go:130] > 18%
	I1217 20:25:45.609800  414292 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:25:45.614741  414292 command_runner.go:130] > 159G
	I1217 20:25:45.614774  414292 fix.go:56] duration metric: took 1.348835133s for fixHost
	I1217 20:25:45.614785  414292 start.go:83] releasing machines lock for "functional-682596", held for 1.348870218s
	I1217 20:25:45.614866  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:45.631621  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:45.631685  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:45.631702  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:45.631735  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:45.631767  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:45.631798  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:45.631848  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.631888  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.631907  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.631926  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.631943  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:45.631995  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.649517  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.754346  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:45.772163  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:45.789636  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:45.795706  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:45.796203  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.803937  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:45.811516  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815311  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815389  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815474  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.856132  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:45.856705  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:45.864064  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.871519  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:45.879293  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883196  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883238  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883306  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.924322  414292 command_runner.go:130] > b5213941
	I1217 20:25:45.924802  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:45.932259  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.939603  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:45.947311  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.950955  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951320  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951411  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.993968  414292 command_runner.go:130] > 51391683
	I1217 20:25:45.994167  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:25:46.002855  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:25:46.007551  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
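The openssl -hash / ln -fs sequence above is the standard OpenSSL subject-hash directory layout: each CA certificate is made reachable as <subject-hash>.0 under /etc/ssl/certs, which is where the three hashes in the log (3ec20f2e, b5213941, 51391683) come from. The same convention by hand, a sketch assuming an arbitrary ca.pem:

	h=$(openssl x509 -hash -noout -in ca.pem)    # prints the subject hash, e.g. b5213941
	sudo ln -fs "$PWD/ca.pem" "/etc/ssl/certs/${h}.0"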
	I1217 20:25:46.011748  414292 ssh_runner.go:195] Run: cat /version.json
	I1217 20:25:46.011837  414292 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:25:46.016112  414292 command_runner.go:130] > {"iso_version": "v1.37.0-1765579389-22117", "kicbase_version": "v0.0.48-1765661130-22141", "minikube_version": "v1.37.0", "commit": "cbb33128a244032d08f8fc6e6c9f03b30f0da3e4"}
	I1217 20:25:46.018576  414292 ssh_runner.go:195] Run: systemctl --version
	I1217 20:25:46.126907  414292 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1217 20:25:46.127016  414292 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1217 20:25:46.127060  414292 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1217 20:25:46.127172  414292 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1217 20:25:46.131726  414292 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1217 20:25:46.131887  414292 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:25:46.131965  414292 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:25:46.140024  414292 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:25:46.140047  414292 start.go:496] detecting cgroup driver to use...
	I1217 20:25:46.140078  414292 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:25:46.140156  414292 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:25:46.155753  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:25:46.168916  414292 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:25:46.169009  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:25:46.184457  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:25:46.197441  414292 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:25:46.302684  414292 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:25:46.421553  414292 docker.go:234] disabling docker service ...
	I1217 20:25:46.421621  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:25:46.436823  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:25:46.449890  414292 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:25:46.565021  414292 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:25:46.678341  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:25:46.693104  414292 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:25:46.705993  414292 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1217 20:25:46.707385  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:25:46.716410  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:25:46.724756  414292 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:25:46.724876  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:25:46.733647  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.742030  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:25:46.750673  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.759312  414292 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:25:46.768595  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:25:46.777345  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:25:46.786196  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:25:46.795479  414292 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:25:46.802392  414292 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1217 20:25:46.803423  414292 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:25:46.811004  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:46.926090  414292 ssh_runner.go:195] Run: sudo systemctl restart containerd
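The sed edits above force containerd's SystemdCgroup option to false so the runtime matches the cgroupfs driver minikube detected on the host. One way to confirm the effective value after the restart, a sketch assuming the same profile:

	minikube ssh -p functional-682596 -- sudo grep -n 'SystemdCgroup' /etc/containerd/config.toml
	# expect: SystemdCgroup = false   (cgroupfs cgroup driver)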
	I1217 20:25:47.068989  414292 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:25:47.069169  414292 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:25:47.073250  414292 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1217 20:25:47.073355  414292 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1217 20:25:47.073385  414292 command_runner.go:130] > Device: 0,72	Inode: 1618        Links: 1
	I1217 20:25:47.073441  414292 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.073470  414292 command_runner.go:130] > Access: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073512  414292 command_runner.go:130] > Modify: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073542  414292 command_runner.go:130] > Change: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073561  414292 command_runner.go:130] >  Birth: -
	I1217 20:25:47.073923  414292 start.go:564] Will wait 60s for crictl version
	I1217 20:25:47.074046  414292 ssh_runner.go:195] Run: which crictl
	I1217 20:25:47.077775  414292 command_runner.go:130] > /usr/local/bin/crictl
	I1217 20:25:47.078218  414292 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:25:47.104139  414292 command_runner.go:130] > Version:  0.1.0
	I1217 20:25:47.104225  414292 command_runner.go:130] > RuntimeName:  containerd
	I1217 20:25:47.104269  414292 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1217 20:25:47.104295  414292 command_runner.go:130] > RuntimeApiVersion:  v1
	I1217 20:25:47.106475  414292 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:25:47.106628  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.130403  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.132698  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.152199  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.159813  414292 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:25:47.162759  414292 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:25:47.179237  414292 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:25:47.183476  414292 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1217 20:25:47.183701  414292 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:25:47.183825  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:47.183890  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.207538  414292 command_runner.go:130] > {
	I1217 20:25:47.207560  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.207564  414292 command_runner.go:130] >     {
	I1217 20:25:47.207574  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.207582  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207588  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.207591  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207595  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207607  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.207614  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207618  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.207625  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207630  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207636  414292 command_runner.go:130] >     },
	I1217 20:25:47.207639  414292 command_runner.go:130] >     {
	I1217 20:25:47.207647  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.207655  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207660  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.207664  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207668  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207678  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.207684  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207688  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.207692  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207696  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207698  414292 command_runner.go:130] >     },
	I1217 20:25:47.207702  414292 command_runner.go:130] >     {
	I1217 20:25:47.207709  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.207715  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207720  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.207735  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207747  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207756  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.207759  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207763  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.207766  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.207770  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207773  414292 command_runner.go:130] >     },
	I1217 20:25:47.207776  414292 command_runner.go:130] >     {
	I1217 20:25:47.207783  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.207787  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207791  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.207795  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207798  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207806  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.207809  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207813  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.207817  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207822  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207826  414292 command_runner.go:130] >       },
	I1217 20:25:47.207833  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207837  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207842  414292 command_runner.go:130] >     },
	I1217 20:25:47.207846  414292 command_runner.go:130] >     {
	I1217 20:25:47.207853  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.207859  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207865  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.207867  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207872  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207886  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.207890  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207894  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.207897  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207906  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207915  414292 command_runner.go:130] >       },
	I1217 20:25:47.207928  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207932  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207934  414292 command_runner.go:130] >     },
	I1217 20:25:47.207938  414292 command_runner.go:130] >     {
	I1217 20:25:47.207947  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.207955  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207961  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.207964  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207968  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207976  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.207982  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207986  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.207990  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207997  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208001  414292 command_runner.go:130] >       },
	I1217 20:25:47.208020  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208028  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208032  414292 command_runner.go:130] >     },
	I1217 20:25:47.208035  414292 command_runner.go:130] >     {
	I1217 20:25:47.208042  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.208049  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208054  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.208058  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208062  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208069  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.208074  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208079  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.208082  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208088  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208091  414292 command_runner.go:130] >     },
	I1217 20:25:47.208097  414292 command_runner.go:130] >     {
	I1217 20:25:47.208104  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.208114  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208120  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.208123  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208128  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208142  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.208146  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208149  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.208153  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208157  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208163  414292 command_runner.go:130] >       },
	I1217 20:25:47.208168  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208173  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208177  414292 command_runner.go:130] >     },
	I1217 20:25:47.208183  414292 command_runner.go:130] >     {
	I1217 20:25:47.208189  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.208195  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208200  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.208203  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208207  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208215  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.208221  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208225  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.208229  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208233  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.208237  414292 command_runner.go:130] >       },
	I1217 20:25:47.208240  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208245  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.208339  414292 command_runner.go:130] >     }
	I1217 20:25:47.208342  414292 command_runner.go:130] >   ]
	I1217 20:25:47.208344  414292 command_runner.go:130] > }
	I1217 20:25:47.208525  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.208539  414292 containerd.go:534] Images already preloaded, skipping extraction
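Both crictl invocations above return the identical image list; minikube decodes it and checks that every image required for v1.35.0-rc.1 is already present, which is what lets it skip the preload extraction (containerd.go:627/534 in the log). A rough sketch of that comparison, with hypothetical types shaped after the JSON shown:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// imageList matches the shape of `crictl images --output json` seen above.
type imageList struct {
	Images []struct {
		RepoTags []string `json:"repoTags"`
	} `json:"images"`
}

// allPreloaded reports whether every required tag is already in the local store.
func allPreloaded(required []string) (bool, error) {
	out, err := exec.Command("sudo", "crictl", "images", "--output", "json").Output()
	if err != nil {
		return false, err
	}
	var list imageList
	if err := json.Unmarshal(out, &list); err != nil {
		return false, err
	}
	have := map[string]bool{}
	for _, img := range list.Images {
		for _, tag := range img.RepoTags {
			have[tag] = true
		}
	}
	for _, want := range required {
		if !have[want] {
			return false, nil
		}
	}
	return true, nil
}

func main() {
	ok, err := allPreloaded([]string{"registry.k8s.io/pause:3.10.1", "registry.k8s.io/etcd:3.6.6-0"})
	fmt.Println(ok, err)
}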
	I1217 20:25:47.208601  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.230634  414292 command_runner.go:130] > {
	I1217 20:25:47.230653  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.230659  414292 command_runner.go:130] >     {
	I1217 20:25:47.230668  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.230673  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230679  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.230683  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230687  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230696  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.230703  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230721  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.230725  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230729  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230735  414292 command_runner.go:130] >     },
	I1217 20:25:47.230741  414292 command_runner.go:130] >     {
	I1217 20:25:47.230756  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.230764  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230769  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.230773  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230786  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230798  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.230801  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230812  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.230816  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230819  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230823  414292 command_runner.go:130] >     },
	I1217 20:25:47.230826  414292 command_runner.go:130] >     {
	I1217 20:25:47.230833  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.230839  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230844  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.230857  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230888  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230900  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.230911  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230916  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.230923  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.230927  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230936  414292 command_runner.go:130] >     },
	I1217 20:25:47.230939  414292 command_runner.go:130] >     {
	I1217 20:25:47.230946  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.230950  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230954  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.230960  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230964  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230972  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.230984  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230988  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.230991  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.230995  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.230998  414292 command_runner.go:130] >       },
	I1217 20:25:47.231003  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231009  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231012  414292 command_runner.go:130] >     },
	I1217 20:25:47.231018  414292 command_runner.go:130] >     {
	I1217 20:25:47.231024  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.231037  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231042  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.231045  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231050  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231063  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.231067  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231071  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.231074  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231087  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231093  414292 command_runner.go:130] >       },
	I1217 20:25:47.231097  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231111  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231117  414292 command_runner.go:130] >     },
	I1217 20:25:47.231125  414292 command_runner.go:130] >     {
	I1217 20:25:47.231132  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.231138  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231144  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.231151  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231155  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231164  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.231168  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231172  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.231178  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231194  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231200  414292 command_runner.go:130] >       },
	I1217 20:25:47.231204  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231208  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231211  414292 command_runner.go:130] >     },
	I1217 20:25:47.231214  414292 command_runner.go:130] >     {
	I1217 20:25:47.231223  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.231238  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231246  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.231250  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231254  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231264  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.231276  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231280  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.231284  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231288  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231291  414292 command_runner.go:130] >     },
	I1217 20:25:47.231294  414292 command_runner.go:130] >     {
	I1217 20:25:47.231309  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.231317  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231323  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.231333  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231337  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231347  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.231359  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231363  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.231366  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231370  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231373  414292 command_runner.go:130] >       },
	I1217 20:25:47.231379  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231392  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231395  414292 command_runner.go:130] >     },
	I1217 20:25:47.231405  414292 command_runner.go:130] >     {
	I1217 20:25:47.231412  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.231418  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231423  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.231428  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231437  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231445  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.231448  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231452  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.231455  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231459  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.231462  414292 command_runner.go:130] >       },
	I1217 20:25:47.231466  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231469  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.231473  414292 command_runner.go:130] >     }
	I1217 20:25:47.231479  414292 command_runner.go:130] >   ]
	I1217 20:25:47.231482  414292 command_runner.go:130] > }
	I1217 20:25:47.233897  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.233919  414292 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:25:47.233928  414292 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:25:47.234041  414292 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
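The [Unit]/[Service] fragment logged by kubeadm.go:947 above is rendered from the node config and written out as the 10-kubeadm.conf drop-in a few lines below. A simplified text/template sketch of producing such a unit; the template literal is a stand-in reconstructed from the logged output, not minikube's source:

package main

import (
	"os"
	"text/template"
)

// unitTmpl is a hypothetical, simplified stand-in for minikube's kubelet
// drop-in template; only the fields visible in the log above are modeled.
const unitTmpl = `[Unit]
Wants={{.Runtime}}.service

[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/{{.KubernetesVersion}}/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override={{.NodeName}} --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip={{.NodeIP}}

[Install]
`

func main() {
	t := template.Must(template.New("kubelet").Parse(unitTmpl))
	t.Execute(os.Stdout, map[string]string{
		"Runtime":           "containerd",
		"KubernetesVersion": "v1.35.0-rc.1",
		"NodeName":          "functional-682596",
		"NodeIP":            "192.168.49.2",
	})
}

The double ExecStart= is deliberate systemd drop-in syntax: the empty assignment clears the packaged unit's command before the override sets the minikube-specific one.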
	I1217 20:25:47.234107  414292 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:25:47.256786  414292 command_runner.go:130] > {
	I1217 20:25:47.256808  414292 command_runner.go:130] >   "cniconfig": {
	I1217 20:25:47.256814  414292 command_runner.go:130] >     "Networks": [
	I1217 20:25:47.256818  414292 command_runner.go:130] >       {
	I1217 20:25:47.256823  414292 command_runner.go:130] >         "Config": {
	I1217 20:25:47.256827  414292 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1217 20:25:47.256833  414292 command_runner.go:130] >           "Name": "cni-loopback",
	I1217 20:25:47.256837  414292 command_runner.go:130] >           "Plugins": [
	I1217 20:25:47.256840  414292 command_runner.go:130] >             {
	I1217 20:25:47.256846  414292 command_runner.go:130] >               "Network": {
	I1217 20:25:47.256851  414292 command_runner.go:130] >                 "ipam": {},
	I1217 20:25:47.256863  414292 command_runner.go:130] >                 "type": "loopback"
	I1217 20:25:47.256875  414292 command_runner.go:130] >               },
	I1217 20:25:47.256880  414292 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1217 20:25:47.256883  414292 command_runner.go:130] >             }
	I1217 20:25:47.256887  414292 command_runner.go:130] >           ],
	I1217 20:25:47.256896  414292 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1217 20:25:47.256900  414292 command_runner.go:130] >         },
	I1217 20:25:47.256911  414292 command_runner.go:130] >         "IFName": "lo"
	I1217 20:25:47.256917  414292 command_runner.go:130] >       }
	I1217 20:25:47.256920  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256924  414292 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1217 20:25:47.256927  414292 command_runner.go:130] >     "PluginDirs": [
	I1217 20:25:47.256932  414292 command_runner.go:130] >       "/opt/cni/bin"
	I1217 20:25:47.256941  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256945  414292 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1217 20:25:47.256949  414292 command_runner.go:130] >     "Prefix": "eth"
	I1217 20:25:47.256952  414292 command_runner.go:130] >   },
	I1217 20:25:47.256957  414292 command_runner.go:130] >   "config": {
	I1217 20:25:47.256962  414292 command_runner.go:130] >     "cdiSpecDirs": [
	I1217 20:25:47.256965  414292 command_runner.go:130] >       "/etc/cdi",
	I1217 20:25:47.256969  414292 command_runner.go:130] >       "/var/run/cdi"
	I1217 20:25:47.256977  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256985  414292 command_runner.go:130] >     "cni": {
	I1217 20:25:47.256991  414292 command_runner.go:130] >       "binDir": "",
	I1217 20:25:47.256995  414292 command_runner.go:130] >       "binDirs": [
	I1217 20:25:47.256999  414292 command_runner.go:130] >         "/opt/cni/bin"
	I1217 20:25:47.257003  414292 command_runner.go:130] >       ],
	I1217 20:25:47.257008  414292 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1217 20:25:47.257025  414292 command_runner.go:130] >       "confTemplate": "",
	I1217 20:25:47.257029  414292 command_runner.go:130] >       "ipPref": "",
	I1217 20:25:47.257033  414292 command_runner.go:130] >       "maxConfNum": 1,
	I1217 20:25:47.257040  414292 command_runner.go:130] >       "setupSerially": false,
	I1217 20:25:47.257044  414292 command_runner.go:130] >       "useInternalLoopback": false
	I1217 20:25:47.257049  414292 command_runner.go:130] >     },
	I1217 20:25:47.257057  414292 command_runner.go:130] >     "containerd": {
	I1217 20:25:47.257061  414292 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1217 20:25:47.257069  414292 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1217 20:25:47.257076  414292 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1217 20:25:47.257080  414292 command_runner.go:130] >       "runtimes": {
	I1217 20:25:47.257084  414292 command_runner.go:130] >         "runc": {
	I1217 20:25:47.257097  414292 command_runner.go:130] >           "ContainerAnnotations": null,
	I1217 20:25:47.257102  414292 command_runner.go:130] >           "PodAnnotations": null,
	I1217 20:25:47.257106  414292 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1217 20:25:47.257111  414292 command_runner.go:130] >           "cgroupWritable": false,
	I1217 20:25:47.257119  414292 command_runner.go:130] >           "cniConfDir": "",
	I1217 20:25:47.257123  414292 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1217 20:25:47.257127  414292 command_runner.go:130] >           "io_type": "",
	I1217 20:25:47.257133  414292 command_runner.go:130] >           "options": {
	I1217 20:25:47.257139  414292 command_runner.go:130] >             "BinaryName": "",
	I1217 20:25:47.257143  414292 command_runner.go:130] >             "CriuImagePath": "",
	I1217 20:25:47.257148  414292 command_runner.go:130] >             "CriuWorkPath": "",
	I1217 20:25:47.257154  414292 command_runner.go:130] >             "IoGid": 0,
	I1217 20:25:47.257158  414292 command_runner.go:130] >             "IoUid": 0,
	I1217 20:25:47.257162  414292 command_runner.go:130] >             "NoNewKeyring": false,
	I1217 20:25:47.257174  414292 command_runner.go:130] >             "Root": "",
	I1217 20:25:47.257186  414292 command_runner.go:130] >             "ShimCgroup": "",
	I1217 20:25:47.257193  414292 command_runner.go:130] >             "SystemdCgroup": false
	I1217 20:25:47.257196  414292 command_runner.go:130] >           },
	I1217 20:25:47.257206  414292 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1217 20:25:47.257213  414292 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1217 20:25:47.257217  414292 command_runner.go:130] >           "runtimePath": "",
	I1217 20:25:47.257224  414292 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1217 20:25:47.257229  414292 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1217 20:25:47.257233  414292 command_runner.go:130] >           "snapshotter": ""
	I1217 20:25:47.257238  414292 command_runner.go:130] >         }
	I1217 20:25:47.257241  414292 command_runner.go:130] >       }
	I1217 20:25:47.257246  414292 command_runner.go:130] >     },
	I1217 20:25:47.257261  414292 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1217 20:25:47.257269  414292 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1217 20:25:47.257274  414292 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1217 20:25:47.257280  414292 command_runner.go:130] >     "disableApparmor": false,
	I1217 20:25:47.257290  414292 command_runner.go:130] >     "disableHugetlbController": true,
	I1217 20:25:47.257294  414292 command_runner.go:130] >     "disableProcMount": false,
	I1217 20:25:47.257299  414292 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1217 20:25:47.257303  414292 command_runner.go:130] >     "enableCDI": true,
	I1217 20:25:47.257309  414292 command_runner.go:130] >     "enableSelinux": false,
	I1217 20:25:47.257313  414292 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1217 20:25:47.257318  414292 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1217 20:25:47.257325  414292 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1217 20:25:47.257331  414292 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1217 20:25:47.257336  414292 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1217 20:25:47.257340  414292 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1217 20:25:47.257353  414292 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1217 20:25:47.257358  414292 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257362  414292 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1217 20:25:47.257368  414292 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257375  414292 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1217 20:25:47.257379  414292 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1217 20:25:47.257386  414292 command_runner.go:130] >   },
	I1217 20:25:47.257390  414292 command_runner.go:130] >   "features": {
	I1217 20:25:47.257396  414292 command_runner.go:130] >     "supplemental_groups_policy": true
	I1217 20:25:47.257399  414292 command_runner.go:130] >   },
	I1217 20:25:47.257403  414292 command_runner.go:130] >   "golang": "go1.24.9",
	I1217 20:25:47.257416  414292 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257429  414292 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257433  414292 command_runner.go:130] >   "runtimeHandlers": [
	I1217 20:25:47.257436  414292 command_runner.go:130] >     {
	I1217 20:25:47.257447  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257451  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257455  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257460  414292 command_runner.go:130] >       }
	I1217 20:25:47.257463  414292 command_runner.go:130] >     },
	I1217 20:25:47.257469  414292 command_runner.go:130] >     {
	I1217 20:25:47.257473  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257477  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257481  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257484  414292 command_runner.go:130] >       },
	I1217 20:25:47.257488  414292 command_runner.go:130] >       "name": "runc"
	I1217 20:25:47.257494  414292 command_runner.go:130] >     }
	I1217 20:25:47.257497  414292 command_runner.go:130] >   ],
	I1217 20:25:47.257502  414292 command_runner.go:130] >   "status": {
	I1217 20:25:47.257506  414292 command_runner.go:130] >     "conditions": [
	I1217 20:25:47.257509  414292 command_runner.go:130] >       {
	I1217 20:25:47.257514  414292 command_runner.go:130] >         "message": "",
	I1217 20:25:47.257526  414292 command_runner.go:130] >         "reason": "",
	I1217 20:25:47.257530  414292 command_runner.go:130] >         "status": true,
	I1217 20:25:47.257536  414292 command_runner.go:130] >         "type": "RuntimeReady"
	I1217 20:25:47.257539  414292 command_runner.go:130] >       },
	I1217 20:25:47.257543  414292 command_runner.go:130] >       {
	I1217 20:25:47.257549  414292 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1217 20:25:47.257554  414292 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1217 20:25:47.257563  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257568  414292 command_runner.go:130] >         "type": "NetworkReady"
	I1217 20:25:47.257574  414292 command_runner.go:130] >       },
	I1217 20:25:47.257577  414292 command_runner.go:130] >       {
	I1217 20:25:47.257599  414292 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1217 20:25:47.257609  414292 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1217 20:25:47.257615  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257620  414292 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1217 20:25:47.257626  414292 command_runner.go:130] >       }
	I1217 20:25:47.257629  414292 command_runner.go:130] >     ]
	I1217 20:25:47.257631  414292 command_runner.go:130] >   }
	I1217 20:25:47.257634  414292 command_runner.go:130] > }
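Note the NetworkReady=false condition in the status block above: no CNI config exists yet in /etc/cni/net.d, which is exactly why the next lines create a CNI manager and recommend kindnet. A small sketch of pulling those conditions out of `crictl info`, with types modeled on the JSON shown:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// runtimeStatus models just the "status.conditions" slice shown above.
type runtimeStatus struct {
	Status struct {
		Conditions []struct {
			Type    string `json:"type"`
			Status  bool   `json:"status"`
			Reason  string `json:"reason"`
			Message string `json:"message"`
		} `json:"conditions"`
	} `json:"status"`
}

func main() {
	out, err := exec.Command("sudo", "crictl", "info").Output()
	if err != nil {
		panic(err)
	}
	var s runtimeStatus
	if err := json.Unmarshal(out, &s); err != nil {
		panic(err)
	}
	for _, c := range s.Status.Conditions {
		if !c.Status {
			fmt.Printf("%s not ready: %s (%s)\n", c.Type, c.Reason, c.Message)
		}
	}
}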
	I1217 20:25:47.259959  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:47.259981  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:47.259991  414292 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:25:47.260020  414292 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:25:47.260142  414292 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1217 20:25:47.260216  414292 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:25:47.267498  414292 command_runner.go:130] > kubeadm
	I1217 20:25:47.267517  414292 command_runner.go:130] > kubectl
	I1217 20:25:47.267520  414292 command_runner.go:130] > kubelet
	I1217 20:25:47.268462  414292 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:25:47.268563  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:25:47.276438  414292 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:25:47.289778  414292 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:25:47.303155  414292 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
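The three "scp memory" entries above stream in-memory assets straight to files on the node rather than copying from disk. The sketch below approximates that with golang.org/x/crypto/ssh by piping bytes into sudo tee; minikube's ssh_runner has its own transfer implementation, and the host, user, and key path here are placeholders.

package main

import (
	"bytes"
	"fmt"
	"os"

	"golang.org/x/crypto/ssh"
)

// pushBytes writes an in-memory asset to a remote path by piping it into
// `sudo tee`, roughly what an "scp memory -->" log line accomplishes.
func pushBytes(client *ssh.Client, data []byte, dst string) error {
	sess, err := client.NewSession()
	if err != nil {
		return err
	}
	defer sess.Close()
	sess.Stdin = bytes.NewReader(data)
	return sess.Run(fmt.Sprintf("sudo tee %s >/dev/null", dst))
}

func main() {
	key, err := os.ReadFile("/home/jenkins/.ssh/id_rsa") // placeholder key path
	if err != nil {
		panic(err)
	}
	signer, err := ssh.ParsePrivateKey(key)
	if err != nil {
		panic(err)
	}
	client, err := ssh.Dial("tcp", "192.168.49.2:22", &ssh.ClientConfig{
		User:            "docker", // placeholder user
		Auth:            []ssh.AuthMethod{ssh.PublicKeys(signer)},
		HostKeyCallback: ssh.InsecureIgnoreHostKey(), // acceptable for a throwaway test node
	})
	if err != nil {
		panic(err)
	}
	defer client.Close()
	if err := pushBytes(client, []byte("unit contents here\n"), "/lib/systemd/system/kubelet.service"); err != nil {
		panic(err)
	}
}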
	I1217 20:25:47.315864  414292 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:25:47.319319  414292 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1217 20:25:47.319605  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:47.441462  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:47.463080  414292 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:25:47.463150  414292 certs.go:195] generating shared ca certs ...
	I1217 20:25:47.463190  414292 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:47.463362  414292 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:25:47.463461  414292 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:25:47.463501  414292 certs.go:257] generating profile certs ...
	I1217 20:25:47.463662  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:25:47.463774  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:25:47.463860  414292 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:25:47.463894  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1217 20:25:47.463938  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1217 20:25:47.463977  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1217 20:25:47.464005  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1217 20:25:47.464049  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1217 20:25:47.464079  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1217 20:25:47.464117  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1217 20:25:47.464151  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1217 20:25:47.464241  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:47.464342  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:47.464377  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:47.464421  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:47.464488  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:47.464541  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:47.464629  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:47.464693  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.464733  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.464771  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.469220  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:25:47.495389  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:25:47.516308  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:25:47.535144  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:25:47.552466  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:25:47.570909  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:25:47.588173  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:25:47.606011  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:25:47.623433  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:47.640520  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:47.657751  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:47.675695  414292 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:25:47.688487  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:47.694560  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:47.694946  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.702368  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:47.710124  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713826  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713858  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713917  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.754917  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:47.755445  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:47.763008  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.770327  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:47.778030  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782014  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782042  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782099  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.822920  414292 command_runner.go:130] > b5213941
	I1217 20:25:47.823058  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:47.830582  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.837906  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:47.845640  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849463  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849531  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849600  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.890040  414292 command_runner.go:130] > 51391683
	I1217 20:25:47.890555  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
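Each of the three certificate installs above follows the same pattern: symlink the PEM into /etc/ssl/certs, compute its OpenSSL subject hash, and verify a <hash>.0 link exists so OpenSSL-based clients can look the CA up. A condensed sketch of the hash-and-link step (must run as root; actually creating the link is an assumption here, since the log only tests for it):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

// linkCert computes the OpenSSL subject hash of a PEM file and creates the
// /etc/ssl/certs/<hash>.0 symlink that the log's `test -L` checks verify.
func linkCert(pem string) error {
	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pem).Output()
	if err != nil {
		return err
	}
	hash := strings.TrimSpace(string(out)) // e.g. "b5213941"
	link := filepath.Join("/etc/ssl/certs", hash+".0")
	os.Remove(link) // emulate ln -f: replace any stale link
	return os.Symlink(pem, link)
}

func main() {
	if err := linkCert("/usr/share/ca-certificates/minikubeCA.pem"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}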
	I1217 20:25:47.898150  414292 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901790  414292 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901872  414292 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1217 20:25:47.901887  414292 command_runner.go:130] > Device: 259,1	Inode: 1060771     Links: 1
	I1217 20:25:47.901895  414292 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.901902  414292 command_runner.go:130] > Access: 2025-12-17 20:21:41.033930957 +0000
	I1217 20:25:47.901907  414292 command_runner.go:130] > Modify: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901912  414292 command_runner.go:130] > Change: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901921  414292 command_runner.go:130] >  Birth: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901988  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:25:47.942293  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.942780  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:25:47.983019  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.983513  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:25:48.024341  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.024837  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:25:48.065771  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.066190  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:25:48.107223  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.107692  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:25:48.148374  414292 command_runner.go:130] > Certificate will not expire
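The six `openssl x509 ... -checkend 86400` runs above each ask whether a certificate expires within the next 24 hours. The same check in pure Go, using only the standard library:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the first certificate in a PEM file expires
// within d — the Go equivalent of `openssl x509 -checkend 86400`.
func expiresWithin(path string, d time.Duration) (bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return false, fmt.Errorf("no PEM block in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
	if err != nil {
		panic(err)
	}
	if soon {
		fmt.Println("certificate will expire within 24h")
	} else {
		fmt.Println("Certificate will not expire")
	}
}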
	I1217 20:25:48.148810  414292 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:48.148912  414292 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:25:48.148983  414292 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:25:48.175983  414292 cri.go:89] found id: ""
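StartCluster begins by enumerating existing kube-system containers with the labeled crictl ps above; the empty id list means nothing is running yet, so the restart path below is safe to take. A trivially equivalent standalone version:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	// Matches the command in the log: list all container IDs (running or not)
	// whose pod namespace label is kube-system.
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet",
		"--label", "io.kubernetes.pod.namespace=kube-system").Output()
	if err != nil {
		panic(err)
	}
	ids := strings.Fields(string(out))
	fmt.Printf("found %d kube-system containers: %v\n", len(ids), ids)
}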
	I1217 20:25:48.176056  414292 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:25:48.182939  414292 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1217 20:25:48.182960  414292 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1217 20:25:48.182967  414292 command_runner.go:130] > /var/lib/minikube/etcd:
	I1217 20:25:48.183854  414292 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:25:48.183910  414292 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:25:48.183977  414292 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:25:48.191197  414292 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:25:48.191635  414292 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.191740  414292 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "functional-682596" cluster setting kubeconfig missing "functional-682596" context setting]
	I1217 20:25:48.192034  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
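
The repair announced above adds the missing cluster and context entries and rewrites the kubeconfig under a file lock. A sketch of the same repair with client-go's clientcmd package; the server address, CA path, and profile name are copied from this log, while the helper itself is an illustration rather than minikube's kubeconfig.go:

package sketch

import (
	"k8s.io/client-go/tools/clientcmd"
	clientcmdapi "k8s.io/client-go/tools/clientcmd/api"
)

// repairKubeconfig adds the missing cluster/context entries for
// functional-682596 and writes the file back (sketch, not minikube's code).
func repairKubeconfig(path string) error {
	cfg, err := clientcmd.LoadFromFile(path)
	if err != nil {
		return err
	}
	cluster := clientcmdapi.NewCluster()
	cluster.Server = "https://192.168.49.2:8441"
	cluster.CertificateAuthority = "/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt"
	cfg.Clusters["functional-682596"] = cluster

	ctx := clientcmdapi.NewContext()
	ctx.Cluster = "functional-682596"
	ctx.AuthInfo = "functional-682596"
	cfg.Contexts["functional-682596"] = ctx

	return clientcmd.WriteToFile(*cfg, path)
}
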
	I1217 20:25:48.192565  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.192744  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
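
The rest.Config dump above reduces to a handful of fields that matter here: the apiserver host and the profile's client certificate, key, and CA. A hand-rolled equivalent with client-go (field values copied from the log):

package sketch

import (
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

// newClient builds the cert-authenticated client the log's kapi.go dump
// describes, reduced to the fields populated in this run.
func newClient() (*kubernetes.Clientset, error) {
	cfg := &rest.Config{
		Host: "https://192.168.49.2:8441",
		TLSClientConfig: rest.TLSClientConfig{
			CertFile: "/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt",
			KeyFile:  "/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key",
			CAFile:   "/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt",
		},
	}
	return kubernetes.NewForConfig(cfg)
}
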
	I1217 20:25:48.193250  414292 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 20:25:48.193273  414292 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 20:25:48.193281  414292 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 20:25:48.193286  414292 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 20:25:48.193293  414292 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1217 20:25:48.193576  414292 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:25:48.193650  414292 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1217 20:25:48.201269  414292 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1217 20:25:48.201338  414292 kubeadm.go:602] duration metric: took 17.417602ms to restartPrimaryControlPlane
	I1217 20:25:48.201355  414292 kubeadm.go:403] duration metric: took 52.552362ms to StartCluster
	I1217 20:25:48.201370  414292 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.201429  414292 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.202007  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.202208  414292 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:25:48.202539  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:48.202581  414292 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 20:25:48.202699  414292 addons.go:70] Setting storage-provisioner=true in profile "functional-682596"
	I1217 20:25:48.202717  414292 addons.go:239] Setting addon storage-provisioner=true in "functional-682596"
	I1217 20:25:48.202742  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.202770  414292 addons.go:70] Setting default-storageclass=true in profile "functional-682596"
	I1217 20:25:48.202806  414292 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-682596"
	I1217 20:25:48.203165  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.203224  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.208687  414292 out.go:179] * Verifying Kubernetes components...
	I1217 20:25:48.211692  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:48.230383  414292 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 20:25:48.233339  414292 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.233361  414292 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 20:25:48.233423  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.236813  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.236975  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.237238  414292 addons.go:239] Setting addon default-storageclass=true in "functional-682596"
	I1217 20:25:48.237267  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.237711  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.262897  414292 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:48.262919  414292 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 20:25:48.262996  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.269972  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.294767  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.418586  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:48.450623  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.465245  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
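
Both addon manifests are applied with the cluster's bundled kubectl under sudo, exactly as the two Run lines above show. A local sketch of that invocation with os/exec; the SSH hop through ssh_runner is elided, so this is only the shape of the command, not minikube's runner:

package sketch

import (
	"fmt"
	"os/exec"
)

// applyAddon runs the kubectl invocation from the log locally instead of over
// SSH. sudo accepts the KUBECONFIG=... assignment as its first argument.
func applyAddon(manifest string) error {
	cmd := exec.Command("sudo",
		"KUBECONFIG=/var/lib/minikube/kubeconfig",
		"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
		"apply", "-f", manifest)
	out, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("apply %s: %v\n%s", manifest, err, out)
	}
	return nil
}
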
	I1217 20:25:49.239916  414292 node_ready.go:35] waiting up to 6m0s for node "functional-682596" to be "Ready" ...
	I1217 20:25:49.240030  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.240095  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.240342  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240376  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240403  414292 retry.go:31] will retry after 252.350229ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240440  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240459  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240479  414292 retry.go:31] will retry after 321.821783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.493033  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.547929  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.551638  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.551667  414292 retry.go:31] will retry after 328.531722ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.562869  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.621023  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.625124  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.625209  414292 retry.go:31] will retry after 442.103425ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
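
Every apply fails the same way: kubectl's client-side validation tries to download the OpenAPI schema from the apiserver, which is still refusing connections on localhost:8441, so addons.go keeps retrying with a growing, jittered delay (252ms, 321ms, 328ms, 442ms, ... in the entries above and below). A generic sketch of that retry shape; this is not minikube's actual retry.go, just the pattern its "will retry after" lines imply:

package sketch

import (
	"context"
	"math/rand"
	"time"
)

// retryWithBackoff retries fn with exponential backoff plus jitter, the
// pattern behind the "will retry after ..." log lines (sketch only).
func retryWithBackoff(ctx context.Context, attempts int, base time.Duration, fn func() error) error {
	var err error
	delay := base
	for i := 0; i < attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		jittered := delay + time.Duration(rand.Int63n(int64(delay)))
		select {
		case <-ctx.Done():
			return ctx.Err()
		case <-time.After(jittered):
		}
		delay *= 2
	}
	return err
}
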
	I1217 20:25:49.740481  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.740559  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.740872  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.881274  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.942102  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.945784  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.945890  414292 retry.go:31] will retry after 409.243705ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.068055  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.127397  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.131721  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.131759  414292 retry.go:31] will retry after 566.560423ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.241000  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.241406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.355732  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:50.414970  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.419857  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.419893  414292 retry.go:31] will retry after 763.212709ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.699479  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.741041  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.741134  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.741465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.776772  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.776815  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.776839  414292 retry.go:31] will retry after 1.24877806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.183473  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:51.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.240545  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:51.240594  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
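
Interleaved with the addon retries, node_ready.go polls GET /api/v1/nodes/functional-682596 roughly every 500ms and treats connection-refused as a transient error, as the warning above shows. A client-go sketch of that wait loop (the helper name and error handling are ours, not minikube's):

package sketch

import (
	"context"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
)

// waitNodeReady polls the node until its Ready condition is True, swallowing
// transient errors (e.g. connection refused while the apiserver restarts).
func waitNodeReady(ctx context.Context, cs kubernetes.Interface, name string, timeout time.Duration) error {
	return wait.PollUntilContextTimeout(ctx, 500*time.Millisecond, timeout, true,
		func(ctx context.Context) (bool, error) {
			node, err := cs.CoreV1().Nodes().Get(ctx, name, metav1.GetOptions{})
			if err != nil {
				return false, nil // retry, matching node_ready.go's behavior
			}
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady {
					return c.Status == corev1.ConditionTrue, nil
				}
			}
			return false, nil
		})
}
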
	I1217 20:25:51.251909  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:51.255943  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.255983  414292 retry.go:31] will retry after 1.271740821s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.740532  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.740649  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.026483  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:52.095052  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.095119  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.095140  414292 retry.go:31] will retry after 1.58694383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.240430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.240682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.528382  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:52.586445  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.590032  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.590066  414292 retry.go:31] will retry after 1.445188932s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.740386  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.740818  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:53.240660  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:53.682297  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:53.740043  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.740108  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.740352  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.743851  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:53.743882  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:53.743900  414292 retry.go:31] will retry after 2.69671946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.036496  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:54.096053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:54.096099  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.096122  414292 retry.go:31] will retry after 2.925706415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.240487  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.240571  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.240903  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:54.740656  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.741104  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:55.240849  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.240918  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:55.241222  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:55.741059  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.741137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.440979  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:56.500702  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:56.500749  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.500767  414292 retry.go:31] will retry after 1.84810195s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.740201  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.740503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.023057  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:57.082954  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:57.083001  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.083020  414292 retry.go:31] will retry after 3.223759279s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.240558  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.740268  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.740347  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:57.740756  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:58.240571  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.240660  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:58.349268  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:58.403710  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:58.407286  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.407317  414292 retry.go:31] will retry after 3.305771044s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.740858  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.240560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.740145  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.740223  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:00.240307  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.240806  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:00.240857  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:00.307216  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:00.372358  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:00.376526  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.376564  414292 retry.go:31] will retry after 8.003704403s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.740135  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.740543  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.240281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.240535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.713237  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:01.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.741019  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.741278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.769053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:01.772711  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:01.772742  414292 retry.go:31] will retry after 3.267552643s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:02.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.240302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:02.740266  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:02.740769  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:03.240210  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:03.740228  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.240943  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.740734  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.741190  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:04.741246  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:05.040756  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:05.102503  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:05.102552  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.102572  414292 retry.go:31] will retry after 12.344413157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... GET polls of https://192.168.49.2:8441/api/v1/nodes/functional-682596 continued every ~500ms from 20:26:05.240 through 20:26:08.240, all with empty responses; another node_ready.go:55 "connection refused" warning was logged at 20:26:07.240 ...]
	I1217 20:26:08.381383  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:08.435212  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:08.439369  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.439410  414292 retry.go:31] will retry after 8.892819822s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... polling continued every ~500ms from 20:26:08.740 through 20:26:17.240, with node_ready.go:55 "connection refused" warnings at 20:26:09.740, 20:26:12.240, 20:26:14.241, and 20:26:16.740 ...]
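
Interleaved with the addon retries, node_ready.go is polling GET /api/v1/nodes/functional-682596 every 500ms, waiting for the node's Ready condition to turn True. A minimal client-go sketch of that kind of wait loop follows; it is an illustration rather than minikube's code, and the kubeconfig path, timeout, and function name are assumptions:

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitNodeReady polls the node object every 500ms (the cadence visible in
// the log) until its Ready condition is True or the context expires.
// Transient errors such as "connection refused" are tolerated and retried.
func waitNodeReady(ctx context.Context, cs kubernetes.Interface, name string) error {
	for {
		node, err := cs.CoreV1().Nodes().Get(ctx, name, metav1.GetOptions{})
		if err == nil {
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady && c.Status == corev1.ConditionTrue {
					return nil
				}
			}
		}
		select {
		case <-ctx.Done():
			return fmt.Errorf("node %q never became Ready: %w", name, ctx.Err())
		case <-time.After(500 * time.Millisecond):
		}
	}
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ctx, cancel := context.WithTimeout(context.Background(), 6*time.Minute)
	defer cancel()
	fmt.Println(waitNodeReady(ctx, cs, "functional-682596"))
}
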
	I1217 20:26:17.333063  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:17.388410  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.391967  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.391995  414292 retry.go:31] will retry after 13.113728844s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.447345  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:17.505124  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.505163  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.505182  414292 retry.go:31] will retry after 11.452403849s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... polling continued every ~500ms from 20:26:17.740 through 20:26:28.740, with node_ready.go:55 "connection refused" warnings at 20:26:18.740, 20:26:21.240, 20:26:23.741, 20:26:26.240, and 20:26:28.240 ...]
	I1217 20:26:28.958534  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:29.018842  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:29.024509  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.024543  414292 retry.go:31] will retry after 28.006345092s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... three more polls at 20:26:29.241, 20:26:29.740, and 20:26:30.241 returned empty responses, the last followed by another node_ready.go:55 "connection refused" warning ...]
	I1217 20:26:30.505938  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:30.574101  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:30.574147  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.574166  414292 retry.go:31] will retry after 31.982210322s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... polling continued every ~500ms from 20:26:30.740 through 20:26:53.240; every response was empty (one attempt at 20:26:31.248 took 7ms, all others 0ms), and node_ready.go:55 logged "connection refused" warnings every 2-2.5s, the last at 20:26:53.241 ...]
	I1217 20:26:53.740774  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.740855  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.240965  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.740570  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.240136  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.240208  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.240531  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:55.740689  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:56.240227  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.240326  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:56.740128  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.740207  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.740534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.031083  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:57.091368  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:57.091412  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.091434  414292 retry.go:31] will retry after 46.71155063s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
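[Editor's note] The apply fails because kubectl's client-side validation first downloads the server's OpenAPI schema, and that download dies on the same refused connection; minikube therefore schedules another attempt after a long, jittered delay (the "will retry after 46.71155063s" line). A plain-Go sketch of that retry-with-backoff idea follows; the doubling factor and jitter range are assumptions for illustration, not the actual retry.go logic:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // retryWithBackoff runs fn until it succeeds or the attempt budget is
    // spent, sleeping an increasing, jittered interval between tries, in
    // the spirit of the "will retry after ..." lines in the log.
    func retryWithBackoff(attempts int, base time.Duration, fn func() error) error {
        var err error
        delay := base
        for i := 0; i < attempts; i++ {
            if err = fn(); err == nil {
                return nil
            }
            if i == attempts-1 {
                break // out of budget, do not sleep again
            }
            sleep := delay + time.Duration(rand.Int63n(int64(delay)/2))
            fmt.Printf("attempt %d failed, will retry after %v: %v\n", i+1, sleep, err)
            time.Sleep(sleep)
            delay *= 2
        }
        return fmt.Errorf("all %d attempts failed, last error: %w", attempts, err)
    }

    func main() {
        err := retryWithBackoff(3, 2*time.Second, func() error {
            return fmt.Errorf("dial tcp 192.168.49.2:8441: connect: connection refused")
        })
        fmt.Println(err)
    }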
	I1217 20:26:57.240719  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.240799  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.241113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.740782  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.740862  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.741143  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:57.741191  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:58.240610  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.240925  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:58.740701  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.740774  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.241090  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.241163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.241466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.740862  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.741174  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:59.741215  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:00.241177  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.241266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.241643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:00.740463  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.740543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.740888  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.240688  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.240764  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.241063  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.740881  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.740989  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.741337  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:01.741388  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:02.240100  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.240176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:02.557038  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:02.616976  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:02.620493  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.620531  414292 retry.go:31] will retry after 42.622456402s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
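[Editor's note] The storage-provisioner manifest hits the same failure mode. Each ssh_runner/command_runner pair in the log is minikube shelling out to the in-node kubectl and capturing its streams. A local sketch of that pattern with os/exec is shown below; the real runner executes the command under sudo over SSH inside the node (that transport is elided here), and the paths are copied from the logged command:

    package main

    import (
        "bytes"
        "fmt"
        "os"
        "os/exec"
    )

    // applyManifest mirrors the logged command: run kubectl with an explicit
    // KUBECONFIG and apply one manifest, returning stdout and stderr so the
    // caller can log them the way command_runner does.
    func applyManifest(kubectl, kubeconfig, manifest string) (string, string, error) {
        cmd := exec.Command(kubectl, "apply", "--force", "-f", manifest)
        cmd.Env = append(os.Environ(), "KUBECONFIG="+kubeconfig)
        var stdout, stderr bytes.Buffer
        cmd.Stdout, cmd.Stderr = &stdout, &stderr
        err := cmd.Run() // a non-zero exit surfaces as *exec.ExitError
        return stdout.String(), stderr.String(), err
    }

    func main() {
        out, errOut, err := applyManifest(
            "/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
            "/var/lib/minikube/kubeconfig",
            "/etc/kubernetes/addons/storage-provisioner.yaml",
        )
        fmt.Printf("stdout:\n%s\nstderr:\n%s\nerr: %v\n", out, errOut, err)
    }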
	I1217 20:27:02.740802  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.740875  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.741140  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.240977  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.241074  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.241392  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.740139  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.740586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:04.240156  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.240238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:04.240579  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:04.740272  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.740738  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.240617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.740277  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:06.240183  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.240615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:06.240675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:06.740224  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.740355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.240397  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.240474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:08.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.240511  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.240866  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:08.240920  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:08.740675  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.740748  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.241042  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.241128  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.241481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.740097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.740192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.740525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.240559  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.740356  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.740782  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:10.740855  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:11.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.240674  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:11.740833  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.741195  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.241068  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.241147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.241476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:13.240124  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.240197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.240538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:13.240601  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:13.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.740596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.240723  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.240797  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.241150  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.740979  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.240146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.240479  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.740243  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:15.740748  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:27:16.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:16.240686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:16.740158  414292 type.go:168] "Request Body" body=""
	I1217 20:27:16.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:17.240299  414292 type.go:168] "Request Body" body=""
	I1217 20:27:17.240374  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:17.240705  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:17.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:27:17.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:17.740687  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:18.240626  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.240717  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.241052  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:18.241112  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:18.740882  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.740963  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.240070  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.240154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.240424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.240358  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.240772  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.740370  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:20.740740  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:21.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.240550  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.240867  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:21.740170  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.240286  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.240355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.740746  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:22.740812  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:23.240211  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:23.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.240106  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.240203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.240577  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.740327  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.740411  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.740719  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:25.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.240459  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.240721  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:25.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:25.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.740294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.240307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.741005  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.741084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.741343  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:27.241172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.241259  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.241612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:27.241675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:27.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.240549  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.240616  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.240862  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.740178  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.240606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.740469  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:29.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:30.240204  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.240594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:30.740299  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.740381  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.740724  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.240991  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.241062  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.241311  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.741040  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.741119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.741431  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:31.741486  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:32.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:32.740134  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.740528  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.240226  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.240327  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.240647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.740611  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:34.240530  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.240875  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:34.240919  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:34.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.740829  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.741167  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.240992  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.241075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.740071  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.740149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.240282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.740384  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.740734  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:36.740791  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:37.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.240426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.240679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:37.740203  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.240504  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.240898  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.740390  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.740487  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.740891  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:38.740956  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:39.240903  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.240984  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.241256  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:39.741008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.741080  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.241008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.241078  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.241381  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.740111  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.740522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:41.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:41.240677  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:41.740082  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.740465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.740376  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.740469  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.740797  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.240389  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.240470  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.240795  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:43.240842  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:43.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.803299  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:27:43.859759  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863204  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863297  414292 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
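	[editor's note] The storageclass apply above fails only because kubectl cannot reach the apiserver to download the OpenAPI schema for validation, and addons.go logs "apply failed, will retry". A minimal Go sketch of that shell-out-and-retry behavior follows; the retry budget and backoff are assumptions for illustration, while the command line mirrors the one logged by ssh_runner.go above.

	// applyretry_sketch.go: run kubectl apply and back off on failure.
	package main

	import (
		"fmt"
		"os/exec"
		"time"
	)

	func applyWithRetry(manifest string, attempts int) error {
		var err error
		for i := 0; i < attempts; i++ {
			cmd := exec.Command("sudo", "KUBECONFIG=/var/lib/minikube/kubeconfig",
				"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
				"apply", "--force", "-f", manifest)
			out, e := cmd.CombinedOutput()
			if e == nil {
				return nil
			}
			err = fmt.Errorf("apply %s failed: %v\n%s", manifest, e, out)
			// Back off before retrying, mirroring the "will retry" warning.
			time.Sleep(2 * time.Second)
		}
		return err
	}

	func main() {
		if err := applyWithRetry("/etc/kubernetes/addons/storageclass.yaml", 5); err != nil {
			fmt.Println("giving up:", err)
		}
	}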
	I1217 20:27:44.241025  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.241121  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:44.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:45.240574  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.240742  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.242019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	W1217 20:27:45.242186  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:45.244153  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:45.319007  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319122  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319226  414292 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:45.322350  414292 out.go:179] * Enabled addons: 
	I1217 20:27:45.325857  414292 addons.go:530] duration metric: took 1m57.123269017s for enable addons: enabled=[]
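	[editor's note] The summary above reports enabled=[] because both addon callbacks (default-storageclass and storage-provisioner) returned errors. A minimal Go sketch of how such a duration metric line is typically produced, assuming the start time is captured before the callbacks run:

	// durationmetric_sketch.go: print an elapsed-time summary for a task.
	package main

	import (
		"fmt"
		"time"
	)

	func main() {
		start := time.Now()
		enabled := []string{}             // no addon callback succeeded above
		time.Sleep(10 * time.Millisecond) // stand-in for the addon callbacks
		fmt.Printf("duration metric: took %s for enable addons: enabled=%v\n",
			time.Since(start), enabled)
	}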
	I1217 20:27:45.740623  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.740707  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.741048  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.240887  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.240956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.741061  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.741135  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.741496  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.745570  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.746779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:47.746914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:48.240563  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.240990  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:48.740796  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.740881  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.741219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.240326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.240395  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.740594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:50.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:50.240662  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:50.741072  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.741398  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.240183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.240123  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.240196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.740295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:52.740683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:53.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.240776  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:53.740345  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.740444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.740732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.240671  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.240743  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.241055  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.740834  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.740911  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.741241  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:54.741301  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:55.241020  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.241088  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.241340  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:55.740115  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.740205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.240618  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.740375  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.740674  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:57.240163  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.240242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:57.240625  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:57.740326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.740758  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.240606  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.240676  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.240930  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.740703  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.741128  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:59.240930  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.241008  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.241330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:59.241390  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:59.741097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.741170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.240277  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.240360  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.240691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.740453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.740814  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.240754  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.740452  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.740539  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.740931  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:01.740984  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:02.240802  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.240878  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.241186  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:02.740889  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.740958  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.741285  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.241130  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.241210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.241568  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.740675  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:04.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.240501  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.240759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:04.240799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:04.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.240365  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.240442  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.240770  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.740435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.740725  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.240330  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.240732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.740442  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.740525  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.740853  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:06.740914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:07.240349  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.240678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:07.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.240587  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.240906  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.740708  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.741159  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:08.741223  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:09.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.241169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.241495  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:09.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.740766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.240453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.240750  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:11.240243  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.240712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:11.240767  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.740446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.740716  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.240420  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.240502  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.240833  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.740634  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.740952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:13.240720  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.240805  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:13.241122  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:13.740900  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.740983  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.240146  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.740140  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.740209  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.240203  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.240633  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:15.740840  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:16.240359  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.240436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.240823  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:16.740505  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.740583  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.740911  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.240693  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.240772  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.241095  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.740839  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.740925  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:17.741241  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:18.241106  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.241186  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.241520  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:18.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.740344  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.240841  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:20.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.240438  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.240765  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:20.240823  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:20.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.740426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.740691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.240640  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.740465  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.740819  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.240416  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.240484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.240741  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.740262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.740592  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:22.740654  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:23.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.740759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.240649  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.240730  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.740869  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.741289  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:24.741351  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:25.241053  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:25.740052  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.740478  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.240294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.740507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:27.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:27.240679  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:27.740368  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.240689  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.240769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.740814  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.740891  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.741191  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:29.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.241401  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:29.241453  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:29.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.740474  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.240176  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... identical polls repeat every ~500ms from 20:28:30.740 through 20:29:31.740: each a GET https://192.168.49.2:8441/api/v1/nodes/functional-682596 with the same Accept and User-Agent headers, each receiving no response (status="" headers="" milliseconds=0), with node_ready.go:55 logging the warning `error getting node "functional-682596" condition "Ready" status (will retry): ... dial tcp 192.168.49.2:8441: connect: connection refused` roughly every two seconds ...]
	I1217 20:29:32.240165  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.240262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.740217  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.740555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:33.240266  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.240665  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:33.240725  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:33.740182  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.740619  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.240178  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.240456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.740169  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.740676  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.240238  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.240333  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:35.740724  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:36.240205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.240297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.240641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:36.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.740448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.240379  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.740614  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:38.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.240670  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.241007  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:38.241051  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:38.740806  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.740880  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.741145  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.240077  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.240158  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.240533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.740575  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.240271  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.240352  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.740751  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:40.740807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:41.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.240603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:41.740073  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.740152  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.740438  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.240671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:42.740846  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:43.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.240416  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:43.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.740242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.240555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.241028  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.740766  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.740836  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.741107  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:44.741148  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:45.241078  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.241164  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.241604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:45.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.240352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:47.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.240462  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.240824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:47.240891  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:47.740486  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.740555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.740822  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.240800  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.240885  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.241255  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.741124  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.741203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.741664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.240445  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.240522  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.240794  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.743007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.743084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.743421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:49.743497  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:50.241133  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.241229  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.241648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:50.740153  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.740228  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.240301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.240639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.740398  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.740482  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.740812  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:52.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.240749  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:52.240807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:52.740175  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.740621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.240207  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.740074  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.740144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.740420  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.240179  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.240492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.740266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:54.740639  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:55.240282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.240356  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:55.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.240062  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.240140  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.240485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.740524  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:57.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.240658  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:57.240715  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:57.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.740339  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.740672  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.240543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.240844  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.740304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.740643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:59.240481  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.240553  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:59.240975  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:59.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.740722  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:00.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:30:00.240720  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:00.241049  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:00.741056  414292 type.go:168] "Request Body" body=""
	I1217 20:30:00.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:00.741446  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:01.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.240569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:01.740198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:01.740706  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:02.240225  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.240325  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:02.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.740599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.240638  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.740781  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:03.740851  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:04.240827  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.240905  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.241246  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:04.741038  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.741117  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.741482  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.240086  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.240166  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.240523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.740583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:06.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:06.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:06.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.740641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.240137  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.240232  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.740212  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.740677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:08.240572  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.240703  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.241012  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:08.740470  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.740547  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.740807  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.240838  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.240937  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.241278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.741105  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.741196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.741522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.240237  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.240593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.740639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:10.740692  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:11.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.240314  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.240666  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:11.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.740357  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.740745  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:12.740799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:13.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.240338  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.240649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:13.740420  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.740503  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.740905  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.240722  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.240802  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.241096  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.740830  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.740904  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.741180  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:14.741236  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:15.241024  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.241450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:15.741152  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.741234  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.741523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:17.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:17.240659  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:17.740234  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.740318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.240499  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.240576  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.240897  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:19.240591  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.240664  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:19.240962  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:19.740705  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.240934  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.241283  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.741051  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.741126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.741393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.240193  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.240534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.740204  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.740636  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:21.740690  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... identical poll iterations elided: GET https://192.168.49.2:8441/api/v1/nodes/functional-682596 repeated every ~500ms with the same Accept and User-Agent headers, each response empty ("dial tcp 192.168.49.2:8441: connect: connection refused"), and node_ready.go:55 retry warnings logged roughly every 2s from 20:30:23 through 20:31:21; polling continues unchanged through 20:31:23 ...]
	I1217 20:31:24.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.240834  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.241219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:24.241275  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:24.740840  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.741224  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.240994  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.241325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.741146  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.741238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.741600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.240173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.740222  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.740302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:26.740604  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:27.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.740267  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.240847  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.740580  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.740654  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:28.741030  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:29.240927  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.241003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.241345  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:29.740932  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.741003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.741297  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.240066  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.240144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.240477  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:31.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.240227  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:31.240572  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:31.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.240364  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.240793  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.740739  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:33.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:33.240635  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:33.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.740654  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.240597  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.240677  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.240945  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.740794  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.741113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:35.240931  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:35.241431  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:35.740086  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.740458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.740185  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.240233  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.740273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.740567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:37.740616  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:38.240625  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.240697  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:38.740867  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.741194  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.240204  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.740275  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.740669  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:39.740728  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:40.240374  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.240701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:40.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.740679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.240409  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.240499  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.240858  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:41.740768  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:42.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.240703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:42.740571  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.740645  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.740967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.240727  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.240796  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.241050  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.740827  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.740901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.741236  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:43.741293  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:44.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.241176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.241525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:44.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.745967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1217 20:31:45.240798  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.240901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.241310  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:45.741143  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.741226  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.741583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:45.741646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:46.241073  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.241146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.241399  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:46.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.240190  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.740649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:48.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.240554  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.241013  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:48.241064  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:48.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.740603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:49.240608  414292 type.go:168] "Request Body" body=""
	I1217 20:31:49.240675  414292 node_ready.go:38] duration metric: took 6m0.000721639s for node "functional-682596" to be "Ready" ...
	I1217 20:31:49.243794  414292 out.go:203] 
	W1217 20:31:49.246551  414292 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1217 20:31:49.246575  414292 out.go:285] * 
	W1217 20:31:49.249079  414292 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:31:49.251429  414292 out.go:203] 

                                                
                                                
** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-682596 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m5.77491071s for "functional-682596" cluster.
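The six minutes of output above is minikube's node_ready wait: a GET against /api/v1/nodes/functional-682596 every 500ms until the node's Ready condition turns True or the 6m0s deadline expires, with each connection-refused error retried. A minimal client-go sketch of an equivalent poll follows (the kubeconfig path is a placeholder, not something taken from this run):

// readiness poll sketch: mirrors the 500ms cadence and 6m deadline in the log
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// assumption: path to a kubeconfig pointing at the cluster under test
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Poll every 500ms, give up after 6 minutes -- the "wait 6m0s for node"
	// deadline seen above surfaces as context.DeadlineExceeded.
	err = wait.PollUntilContextTimeout(context.Background(), 500*time.Millisecond, 6*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			node, err := cs.CoreV1().Nodes().Get(ctx, "functional-682596", metav1.GetOptions{})
			if err != nil {
				// transient errors (e.g. connection refused) are swallowed and
				// retried, matching the "will retry" warnings in the log
				return false, nil
			}
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady {
					return c.Status == corev1.ConditionTrue, nil
				}
			}
			return false, nil
		})
	fmt.Println("wait result:", err)
}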
I1217 20:31:49.773494  369461 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
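The inspect output above shows how the kicbase container publishes each guest port on a random loopback port (the apiserver's 8441/tcp lands on 127.0.0.1:33166 in this run). A small stdlib-only Go sketch of pulling that mapping back out of `docker inspect`, assuming the docker CLI is on PATH:

// decode only the NetworkSettings.Ports field from `docker inspect` output
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

type inspectEntry struct {
	NetworkSettings struct {
		Ports map[string][]struct {
			HostIp   string
			HostPort string
		}
	}
}

func main() {
	out, err := exec.Command("docker", "inspect", "functional-682596").Output()
	if err != nil {
		panic(err)
	}
	var entries []inspectEntry // `docker inspect` emits a JSON array
	if err := json.Unmarshal(out, &entries); err != nil {
		panic(err)
	}
	for _, b := range entries[0].NetworkSettings.Ports["8441/tcp"] {
		fmt.Printf("apiserver published at %s:%s\n", b.HostIp, b.HostPort)
	}
}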
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (374.348548ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
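`minikube status` encodes degraded components in its exit code, which is why the harness records exit status 2 as "may be ok" even though the host line still reads Running. A sketch of capturing that exit code from Go without treating it as fatal (binary path and profile name copied from the run above):

// run the status command and read the exit code as data, not as a failure
package main

import (
	"errors"
	"fmt"
	"os/exec"
)

func main() {
	cmd := exec.Command("out/minikube-linux-arm64", "status",
		"--format={{.Host}}", "-p", "functional-682596")
	out, err := cmd.Output()
	code := 0
	var exitErr *exec.ExitError
	if errors.As(err, &exitErr) {
		code = exitErr.ExitCode() // e.g. 2 in the run above
	} else if err != nil {
		panic(err) // the binary could not be started at all
	}
	fmt.Printf("host state %q, exit code %d\n", string(out), code)
}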
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-032730 ssh sudo cat /usr/share/ca-certificates/3694612.pem                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/test/nested/copy/369461/hosts                                                                                               │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save kicbase/echo-server:functional-032730 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image rm kicbase/echo-server:functional-032730 --alsologtostderr                                                                              │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format yaml --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format short --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format json --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format table --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh pgrep buildkitd                                                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image          │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                          │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete         │ -p functional-032730                                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start          │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start          │ -p functional-682596 --alsologtostderr -v=8                                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:25:44
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:25:44.045489  414292 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:25:44.045686  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.045714  414292 out.go:374] Setting ErrFile to fd 2...
	I1217 20:25:44.045733  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.046029  414292 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:25:44.046470  414292 out.go:368] Setting JSON to false
	I1217 20:25:44.047409  414292 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11289,"bootTime":1765991855,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:25:44.047515  414292 start.go:143] virtualization:  
	I1217 20:25:44.053027  414292 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:25:44.056011  414292 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:25:44.056093  414292 notify.go:221] Checking for updates...
	I1217 20:25:44.061883  414292 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:25:44.064833  414292 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:44.067589  414292 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:25:44.070446  414292 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:25:44.073380  414292 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:25:44.076968  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:44.077128  414292 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:25:44.112208  414292 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:25:44.112455  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.167112  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.158029599 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.167209  414292 docker.go:319] overlay module found
	I1217 20:25:44.170171  414292 out.go:179] * Using the docker driver based on existing profile
	I1217 20:25:44.173086  414292 start.go:309] selected driver: docker
	I1217 20:25:44.173109  414292 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.173214  414292 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:25:44.173330  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.234258  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.225129855 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.234785  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:44.234848  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:44.234909  414292 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.238034  414292 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:25:44.240853  414292 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:25:44.243760  414292 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:25:44.246713  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:44.246768  414292 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:25:44.246782  414292 cache.go:65] Caching tarball of preloaded images
	I1217 20:25:44.246797  414292 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:25:44.246869  414292 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:25:44.246880  414292 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:25:44.246994  414292 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:25:44.265764  414292 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:25:44.265789  414292 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:25:44.265812  414292 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:25:44.265841  414292 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:25:44.265903  414292 start.go:364] duration metric: took 36.013µs to acquireMachinesLock for "functional-682596"
	I1217 20:25:44.265927  414292 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:25:44.265936  414292 fix.go:54] fixHost starting: 
	I1217 20:25:44.266187  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:44.282574  414292 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:25:44.282603  414292 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:25:44.285918  414292 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:25:44.285950  414292 machine.go:94] provisionDockerMachine start ...
	I1217 20:25:44.286031  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.302759  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.303096  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.303111  414292 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:25:44.431913  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.431939  414292 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:25:44.432002  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.450770  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.451117  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.451136  414292 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:25:44.601580  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.601732  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.619103  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.619412  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.619435  414292 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:25:44.748545  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:25:44.748571  414292 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:25:44.748593  414292 ubuntu.go:190] setting up certificates
	I1217 20:25:44.748603  414292 provision.go:84] configureAuth start
	I1217 20:25:44.748675  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:44.766057  414292 provision.go:143] copyHostCerts
	I1217 20:25:44.766100  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766141  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:25:44.766152  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766226  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:25:44.766327  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766347  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:25:44.766357  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766385  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:25:44.766441  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766461  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:25:44.766471  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766501  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:25:44.766561  414292 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:25:45.107844  414292 provision.go:177] copyRemoteCerts
	I1217 20:25:45.108657  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:25:45.108873  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.149674  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.277212  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1217 20:25:45.277284  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:25:45.298737  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1217 20:25:45.298796  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:25:45.320659  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1217 20:25:45.320720  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:25:45.338755  414292 provision.go:87] duration metric: took 590.128101ms to configureAuth
	I1217 20:25:45.338800  414292 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:25:45.338978  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:45.339040  414292 machine.go:97] duration metric: took 1.053082119s to provisionDockerMachine
	I1217 20:25:45.339048  414292 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:25:45.339059  414292 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:25:45.339122  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:25:45.339165  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.356059  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.452345  414292 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:25:45.455946  414292 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1217 20:25:45.455965  414292 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1217 20:25:45.455970  414292 command_runner.go:130] > VERSION_ID="12"
	I1217 20:25:45.455975  414292 command_runner.go:130] > VERSION="12 (bookworm)"
	I1217 20:25:45.455980  414292 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1217 20:25:45.455983  414292 command_runner.go:130] > ID=debian
	I1217 20:25:45.455989  414292 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1217 20:25:45.455994  414292 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1217 20:25:45.456008  414292 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1217 20:25:45.456046  414292 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:25:45.456062  414292 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:25:45.456073  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:25:45.456130  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:25:45.456208  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:25:45.456215  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /etc/ssl/certs/3694612.pem
	I1217 20:25:45.456308  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:25:45.456313  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> /etc/test/nested/copy/369461/hosts
	I1217 20:25:45.456356  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:25:45.464083  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.481460  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:25:45.500420  414292 start.go:296] duration metric: took 161.357637ms for postStartSetup
	I1217 20:25:45.500542  414292 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:25:45.500615  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.517677  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.609195  414292 command_runner.go:130] > 18%
	I1217 20:25:45.609800  414292 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:25:45.614741  414292 command_runner.go:130] > 159G
	I1217 20:25:45.614774  414292 fix.go:56] duration metric: took 1.348835133s for fixHost
	I1217 20:25:45.614785  414292 start.go:83] releasing machines lock for "functional-682596", held for 1.348870218s
	I1217 20:25:45.614866  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:45.631621  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:45.631685  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:45.631702  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:45.631735  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:45.631767  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:45.631798  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:45.631848  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.631888  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.631907  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.631926  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.631943  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:45.631995  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.649517  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.754346  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:45.772163  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:45.789636  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:45.795706  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:45.796203  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.803937  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:45.811516  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815311  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815389  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815474  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.856132  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:45.856705  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:45.864064  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.871519  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:45.879293  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883196  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883238  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883306  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.924322  414292 command_runner.go:130] > b5213941
	I1217 20:25:45.924802  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:45.932259  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.939603  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:45.947311  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.950955  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951320  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951411  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.993968  414292 command_runner.go:130] > 51391683
	I1217 20:25:45.994167  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:25:46.002855  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:25:46.007551  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
	I1217 20:25:46.011748  414292 ssh_runner.go:195] Run: cat /version.json
	I1217 20:25:46.011837  414292 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:25:46.016112  414292 command_runner.go:130] > {"iso_version": "v1.37.0-1765579389-22117", "kicbase_version": "v0.0.48-1765661130-22141", "minikube_version": "v1.37.0", "commit": "cbb33128a244032d08f8fc6e6c9f03b30f0da3e4"}
	I1217 20:25:46.018576  414292 ssh_runner.go:195] Run: systemctl --version
	I1217 20:25:46.126907  414292 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1217 20:25:46.127016  414292 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1217 20:25:46.127060  414292 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1217 20:25:46.127172  414292 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1217 20:25:46.131726  414292 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1217 20:25:46.131887  414292 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:25:46.131965  414292 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:25:46.140024  414292 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:25:46.140047  414292 start.go:496] detecting cgroup driver to use...
	I1217 20:25:46.140078  414292 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:25:46.140156  414292 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:25:46.155753  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:25:46.168916  414292 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:25:46.169009  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:25:46.184457  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:25:46.197441  414292 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:25:46.302684  414292 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:25:46.421553  414292 docker.go:234] disabling docker service ...
	I1217 20:25:46.421621  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:25:46.436823  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:25:46.449890  414292 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:25:46.565021  414292 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:25:46.678341  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:25:46.693104  414292 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:25:46.705993  414292 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1217 20:25:46.707385  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:25:46.716410  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:25:46.724756  414292 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:25:46.724876  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:25:46.733647  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.742030  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:25:46.750673  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.759312  414292 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:25:46.768595  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:25:46.777345  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:25:46.786196  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:25:46.795479  414292 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:25:46.802392  414292 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1217 20:25:46.803423  414292 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:25:46.811004  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:46.926090  414292 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 20:25:47.068989  414292 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:25:47.069169  414292 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:25:47.073250  414292 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1217 20:25:47.073355  414292 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1217 20:25:47.073385  414292 command_runner.go:130] > Device: 0,72	Inode: 1618        Links: 1
	I1217 20:25:47.073441  414292 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.073470  414292 command_runner.go:130] > Access: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073512  414292 command_runner.go:130] > Modify: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073542  414292 command_runner.go:130] > Change: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073561  414292 command_runner.go:130] >  Birth: -
	I1217 20:25:47.073923  414292 start.go:564] Will wait 60s for crictl version
	I1217 20:25:47.074046  414292 ssh_runner.go:195] Run: which crictl
	I1217 20:25:47.077775  414292 command_runner.go:130] > /usr/local/bin/crictl
	I1217 20:25:47.078218  414292 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:25:47.104139  414292 command_runner.go:130] > Version:  0.1.0
	I1217 20:25:47.104225  414292 command_runner.go:130] > RuntimeName:  containerd
	I1217 20:25:47.104269  414292 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1217 20:25:47.104295  414292 command_runner.go:130] > RuntimeApiVersion:  v1
	I1217 20:25:47.106475  414292 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:25:47.106628  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.130403  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.132698  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.152199  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.159813  414292 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:25:47.162759  414292 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:25:47.179237  414292 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:25:47.183476  414292 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1217 20:25:47.183701  414292 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:25:47.183825  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:47.183890  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.207538  414292 command_runner.go:130] > {
	I1217 20:25:47.207560  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.207564  414292 command_runner.go:130] >     {
	I1217 20:25:47.207574  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.207582  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207588  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.207591  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207595  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207607  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.207614  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207618  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.207625  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207630  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207636  414292 command_runner.go:130] >     },
	I1217 20:25:47.207639  414292 command_runner.go:130] >     {
	I1217 20:25:47.207647  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.207655  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207660  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.207664  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207668  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207678  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.207684  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207688  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.207692  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207696  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207698  414292 command_runner.go:130] >     },
	I1217 20:25:47.207702  414292 command_runner.go:130] >     {
	I1217 20:25:47.207709  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.207715  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207720  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.207735  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207747  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207756  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.207759  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207763  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.207766  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.207770  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207773  414292 command_runner.go:130] >     },
	I1217 20:25:47.207776  414292 command_runner.go:130] >     {
	I1217 20:25:47.207783  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.207787  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207791  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.207795  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207798  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207806  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.207809  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207813  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.207817  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207822  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207826  414292 command_runner.go:130] >       },
	I1217 20:25:47.207833  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207837  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207842  414292 command_runner.go:130] >     },
	I1217 20:25:47.207846  414292 command_runner.go:130] >     {
	I1217 20:25:47.207853  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.207859  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207865  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.207867  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207872  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207886  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.207890  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207894  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.207897  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207906  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207915  414292 command_runner.go:130] >       },
	I1217 20:25:47.207928  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207932  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207934  414292 command_runner.go:130] >     },
	I1217 20:25:47.207938  414292 command_runner.go:130] >     {
	I1217 20:25:47.207947  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.207955  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207961  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.207964  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207968  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207976  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.207982  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207986  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.207990  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207997  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208001  414292 command_runner.go:130] >       },
	I1217 20:25:47.208020  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208028  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208032  414292 command_runner.go:130] >     },
	I1217 20:25:47.208035  414292 command_runner.go:130] >     {
	I1217 20:25:47.208042  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.208049  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208054  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.208058  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208062  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208069  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.208074  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208079  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.208082  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208088  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208091  414292 command_runner.go:130] >     },
	I1217 20:25:47.208097  414292 command_runner.go:130] >     {
	I1217 20:25:47.208104  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.208114  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208120  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.208123  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208128  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208142  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.208146  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208149  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.208153  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208157  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208163  414292 command_runner.go:130] >       },
	I1217 20:25:47.208168  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208173  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208177  414292 command_runner.go:130] >     },
	I1217 20:25:47.208183  414292 command_runner.go:130] >     {
	I1217 20:25:47.208189  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.208195  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208200  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.208203  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208207  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208215  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.208221  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208225  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.208229  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208233  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.208237  414292 command_runner.go:130] >       },
	I1217 20:25:47.208240  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208245  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.208339  414292 command_runner.go:130] >     }
	I1217 20:25:47.208342  414292 command_runner.go:130] >   ]
	I1217 20:25:47.208344  414292 command_runner.go:130] > }
	I1217 20:25:47.208525  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.208539  414292 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:25:47.208601  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.230634  414292 command_runner.go:130] > {
	I1217 20:25:47.230653  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.230659  414292 command_runner.go:130] >     {
	I1217 20:25:47.230668  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.230673  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230679  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.230683  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230687  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230696  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.230703  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230721  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.230725  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230729  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230735  414292 command_runner.go:130] >     },
	I1217 20:25:47.230741  414292 command_runner.go:130] >     {
	I1217 20:25:47.230756  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.230764  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230769  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.230773  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230786  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230798  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.230801  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230812  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.230816  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230819  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230823  414292 command_runner.go:130] >     },
	I1217 20:25:47.230826  414292 command_runner.go:130] >     {
	I1217 20:25:47.230833  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.230839  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230844  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.230857  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230888  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230900  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.230911  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230916  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.230923  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.230927  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230936  414292 command_runner.go:130] >     },
	I1217 20:25:47.230939  414292 command_runner.go:130] >     {
	I1217 20:25:47.230946  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.230950  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230954  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.230960  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230964  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230972  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.230984  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230988  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.230991  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.230995  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.230998  414292 command_runner.go:130] >       },
	I1217 20:25:47.231003  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231009  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231012  414292 command_runner.go:130] >     },
	I1217 20:25:47.231018  414292 command_runner.go:130] >     {
	I1217 20:25:47.231024  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.231037  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231042  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.231045  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231050  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231063  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.231067  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231071  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.231074  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231087  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231093  414292 command_runner.go:130] >       },
	I1217 20:25:47.231097  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231111  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231117  414292 command_runner.go:130] >     },
	I1217 20:25:47.231125  414292 command_runner.go:130] >     {
	I1217 20:25:47.231132  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.231138  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231144  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.231151  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231155  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231164  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.231168  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231172  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.231178  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231194  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231200  414292 command_runner.go:130] >       },
	I1217 20:25:47.231204  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231208  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231211  414292 command_runner.go:130] >     },
	I1217 20:25:47.231214  414292 command_runner.go:130] >     {
	I1217 20:25:47.231223  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.231238  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231246  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.231250  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231254  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231264  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.231276  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231280  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.231284  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231288  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231291  414292 command_runner.go:130] >     },
	I1217 20:25:47.231294  414292 command_runner.go:130] >     {
	I1217 20:25:47.231309  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.231317  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231323  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.231333  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231337  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231347  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.231359  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231363  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.231366  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231370  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231373  414292 command_runner.go:130] >       },
	I1217 20:25:47.231379  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231392  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231395  414292 command_runner.go:130] >     },
	I1217 20:25:47.231405  414292 command_runner.go:130] >     {
	I1217 20:25:47.231412  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.231418  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231423  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.231428  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231437  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231445  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.231448  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231452  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.231455  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231459  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.231462  414292 command_runner.go:130] >       },
	I1217 20:25:47.231466  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231469  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.231473  414292 command_runner.go:130] >     }
	I1217 20:25:47.231479  414292 command_runner.go:130] >   ]
	I1217 20:25:47.231482  414292 command_runner.go:130] > }
	I1217 20:25:47.233897  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.233919  414292 cache_images.go:86] Images are preloaded, skipping loading
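The JSON listing above is what minikube reads back from the runtime before concluding the preload is complete. A minimal sketch (not minikube's actual implementation) of decoding such `sudo crictl images -o json` output and checking a few required tags; the top-level `images` field and the tag names are taken from the log above:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os/exec"
    )

    // imageList mirrors only the fields of interest from `crictl images -o json`.
    type imageList struct {
    	Images []struct {
    		ID       string   `json:"id"`
    		RepoTags []string `json:"repoTags"`
    	} `json:"images"`
    }

    func main() {
    	out, err := exec.Command("sudo", "crictl", "images", "-o", "json").Output()
    	if err != nil {
    		panic(err)
    	}
    	var list imageList
    	if err := json.Unmarshal(out, &list); err != nil {
    		panic(err)
    	}
    	have := make(map[string]bool)
    	for _, img := range list.Images {
    		for _, tag := range img.RepoTags {
    			have[tag] = true
    		}
    	}
    	// Tags copied from the listing above; a real check would cover the full set.
    	for _, want := range []string{
    		"registry.k8s.io/kube-apiserver:v1.35.0-rc.1",
    		"registry.k8s.io/etcd:3.6.6-0",
    		"registry.k8s.io/coredns/coredns:v1.13.1",
    	} {
    		fmt.Printf("%-50s preloaded=%v\n", want, have[want])
    	}
    }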
	I1217 20:25:47.233928  414292 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:25:47.234041  414292 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
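The unit text above is the kubelet systemd drop-in minikube is about to write (the 10-kubeadm.conf scp'd a few lines below). As a rough sketch of how such a file can be rendered from per-node values — the template and field names here are illustrative, not minikube's types:

    package main

    import (
    	"os"
    	"text/template"
    )

    // dropin is a simplified version of the drop-in shown in the log above.
    const dropin = `[Unit]
    Wants={{.Runtime}}.service

    [Service]
    ExecStart=
    ExecStart={{.Kubelet}} --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override={{.Node}} --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip={{.IP}}

    [Install]
    `

    func main() {
    	t := template.Must(template.New("kubelet").Parse(dropin))
    	// Values copied from the log line above.
    	err := t.Execute(os.Stdout, map[string]string{
    		"Runtime": "containerd",
    		"Kubelet": "/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet",
    		"Node":    "functional-682596",
    		"IP":      "192.168.49.2",
    	})
    	if err != nil {
    		panic(err)
    	}
    }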
	I1217 20:25:47.234107  414292 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:25:47.256786  414292 command_runner.go:130] > {
	I1217 20:25:47.256808  414292 command_runner.go:130] >   "cniconfig": {
	I1217 20:25:47.256814  414292 command_runner.go:130] >     "Networks": [
	I1217 20:25:47.256818  414292 command_runner.go:130] >       {
	I1217 20:25:47.256823  414292 command_runner.go:130] >         "Config": {
	I1217 20:25:47.256827  414292 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1217 20:25:47.256833  414292 command_runner.go:130] >           "Name": "cni-loopback",
	I1217 20:25:47.256837  414292 command_runner.go:130] >           "Plugins": [
	I1217 20:25:47.256840  414292 command_runner.go:130] >             {
	I1217 20:25:47.256846  414292 command_runner.go:130] >               "Network": {
	I1217 20:25:47.256851  414292 command_runner.go:130] >                 "ipam": {},
	I1217 20:25:47.256863  414292 command_runner.go:130] >                 "type": "loopback"
	I1217 20:25:47.256875  414292 command_runner.go:130] >               },
	I1217 20:25:47.256880  414292 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1217 20:25:47.256883  414292 command_runner.go:130] >             }
	I1217 20:25:47.256887  414292 command_runner.go:130] >           ],
	I1217 20:25:47.256896  414292 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1217 20:25:47.256900  414292 command_runner.go:130] >         },
	I1217 20:25:47.256911  414292 command_runner.go:130] >         "IFName": "lo"
	I1217 20:25:47.256917  414292 command_runner.go:130] >       }
	I1217 20:25:47.256920  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256924  414292 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1217 20:25:47.256927  414292 command_runner.go:130] >     "PluginDirs": [
	I1217 20:25:47.256932  414292 command_runner.go:130] >       "/opt/cni/bin"
	I1217 20:25:47.256941  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256945  414292 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1217 20:25:47.256949  414292 command_runner.go:130] >     "Prefix": "eth"
	I1217 20:25:47.256952  414292 command_runner.go:130] >   },
	I1217 20:25:47.256957  414292 command_runner.go:130] >   "config": {
	I1217 20:25:47.256962  414292 command_runner.go:130] >     "cdiSpecDirs": [
	I1217 20:25:47.256965  414292 command_runner.go:130] >       "/etc/cdi",
	I1217 20:25:47.256969  414292 command_runner.go:130] >       "/var/run/cdi"
	I1217 20:25:47.256977  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256985  414292 command_runner.go:130] >     "cni": {
	I1217 20:25:47.256991  414292 command_runner.go:130] >       "binDir": "",
	I1217 20:25:47.256995  414292 command_runner.go:130] >       "binDirs": [
	I1217 20:25:47.256999  414292 command_runner.go:130] >         "/opt/cni/bin"
	I1217 20:25:47.257003  414292 command_runner.go:130] >       ],
	I1217 20:25:47.257008  414292 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1217 20:25:47.257025  414292 command_runner.go:130] >       "confTemplate": "",
	I1217 20:25:47.257029  414292 command_runner.go:130] >       "ipPref": "",
	I1217 20:25:47.257033  414292 command_runner.go:130] >       "maxConfNum": 1,
	I1217 20:25:47.257040  414292 command_runner.go:130] >       "setupSerially": false,
	I1217 20:25:47.257044  414292 command_runner.go:130] >       "useInternalLoopback": false
	I1217 20:25:47.257049  414292 command_runner.go:130] >     },
	I1217 20:25:47.257057  414292 command_runner.go:130] >     "containerd": {
	I1217 20:25:47.257061  414292 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1217 20:25:47.257069  414292 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1217 20:25:47.257076  414292 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1217 20:25:47.257080  414292 command_runner.go:130] >       "runtimes": {
	I1217 20:25:47.257084  414292 command_runner.go:130] >         "runc": {
	I1217 20:25:47.257097  414292 command_runner.go:130] >           "ContainerAnnotations": null,
	I1217 20:25:47.257102  414292 command_runner.go:130] >           "PodAnnotations": null,
	I1217 20:25:47.257106  414292 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1217 20:25:47.257111  414292 command_runner.go:130] >           "cgroupWritable": false,
	I1217 20:25:47.257119  414292 command_runner.go:130] >           "cniConfDir": "",
	I1217 20:25:47.257123  414292 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1217 20:25:47.257127  414292 command_runner.go:130] >           "io_type": "",
	I1217 20:25:47.257133  414292 command_runner.go:130] >           "options": {
	I1217 20:25:47.257139  414292 command_runner.go:130] >             "BinaryName": "",
	I1217 20:25:47.257143  414292 command_runner.go:130] >             "CriuImagePath": "",
	I1217 20:25:47.257148  414292 command_runner.go:130] >             "CriuWorkPath": "",
	I1217 20:25:47.257154  414292 command_runner.go:130] >             "IoGid": 0,
	I1217 20:25:47.257158  414292 command_runner.go:130] >             "IoUid": 0,
	I1217 20:25:47.257162  414292 command_runner.go:130] >             "NoNewKeyring": false,
	I1217 20:25:47.257174  414292 command_runner.go:130] >             "Root": "",
	I1217 20:25:47.257186  414292 command_runner.go:130] >             "ShimCgroup": "",
	I1217 20:25:47.257193  414292 command_runner.go:130] >             "SystemdCgroup": false
	I1217 20:25:47.257196  414292 command_runner.go:130] >           },
	I1217 20:25:47.257206  414292 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1217 20:25:47.257213  414292 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1217 20:25:47.257217  414292 command_runner.go:130] >           "runtimePath": "",
	I1217 20:25:47.257224  414292 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1217 20:25:47.257229  414292 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1217 20:25:47.257233  414292 command_runner.go:130] >           "snapshotter": ""
	I1217 20:25:47.257238  414292 command_runner.go:130] >         }
	I1217 20:25:47.257241  414292 command_runner.go:130] >       }
	I1217 20:25:47.257246  414292 command_runner.go:130] >     },
	I1217 20:25:47.257261  414292 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1217 20:25:47.257269  414292 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1217 20:25:47.257274  414292 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1217 20:25:47.257280  414292 command_runner.go:130] >     "disableApparmor": false,
	I1217 20:25:47.257290  414292 command_runner.go:130] >     "disableHugetlbController": true,
	I1217 20:25:47.257294  414292 command_runner.go:130] >     "disableProcMount": false,
	I1217 20:25:47.257299  414292 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1217 20:25:47.257303  414292 command_runner.go:130] >     "enableCDI": true,
	I1217 20:25:47.257309  414292 command_runner.go:130] >     "enableSelinux": false,
	I1217 20:25:47.257313  414292 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1217 20:25:47.257318  414292 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1217 20:25:47.257325  414292 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1217 20:25:47.257331  414292 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1217 20:25:47.257336  414292 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1217 20:25:47.257340  414292 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1217 20:25:47.257353  414292 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1217 20:25:47.257358  414292 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257362  414292 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1217 20:25:47.257368  414292 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257375  414292 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1217 20:25:47.257379  414292 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1217 20:25:47.257386  414292 command_runner.go:130] >   },
	I1217 20:25:47.257390  414292 command_runner.go:130] >   "features": {
	I1217 20:25:47.257396  414292 command_runner.go:130] >     "supplemental_groups_policy": true
	I1217 20:25:47.257399  414292 command_runner.go:130] >   },
	I1217 20:25:47.257403  414292 command_runner.go:130] >   "golang": "go1.24.9",
	I1217 20:25:47.257416  414292 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257429  414292 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257433  414292 command_runner.go:130] >   "runtimeHandlers": [
	I1217 20:25:47.257436  414292 command_runner.go:130] >     {
	I1217 20:25:47.257447  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257451  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257455  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257460  414292 command_runner.go:130] >       }
	I1217 20:25:47.257463  414292 command_runner.go:130] >     },
	I1217 20:25:47.257469  414292 command_runner.go:130] >     {
	I1217 20:25:47.257473  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257477  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257481  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257484  414292 command_runner.go:130] >       },
	I1217 20:25:47.257488  414292 command_runner.go:130] >       "name": "runc"
	I1217 20:25:47.257494  414292 command_runner.go:130] >     }
	I1217 20:25:47.257497  414292 command_runner.go:130] >   ],
	I1217 20:25:47.257502  414292 command_runner.go:130] >   "status": {
	I1217 20:25:47.257506  414292 command_runner.go:130] >     "conditions": [
	I1217 20:25:47.257509  414292 command_runner.go:130] >       {
	I1217 20:25:47.257514  414292 command_runner.go:130] >         "message": "",
	I1217 20:25:47.257526  414292 command_runner.go:130] >         "reason": "",
	I1217 20:25:47.257530  414292 command_runner.go:130] >         "status": true,
	I1217 20:25:47.257536  414292 command_runner.go:130] >         "type": "RuntimeReady"
	I1217 20:25:47.257539  414292 command_runner.go:130] >       },
	I1217 20:25:47.257543  414292 command_runner.go:130] >       {
	I1217 20:25:47.257549  414292 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1217 20:25:47.257554  414292 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1217 20:25:47.257563  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257568  414292 command_runner.go:130] >         "type": "NetworkReady"
	I1217 20:25:47.257574  414292 command_runner.go:130] >       },
	I1217 20:25:47.257577  414292 command_runner.go:130] >       {
	I1217 20:25:47.257599  414292 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1217 20:25:47.257609  414292 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1217 20:25:47.257615  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257620  414292 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1217 20:25:47.257626  414292 command_runner.go:130] >       }
	I1217 20:25:47.257629  414292 command_runner.go:130] >     ]
	I1217 20:25:47.257631  414292 command_runner.go:130] >   }
	I1217 20:25:47.257634  414292 command_runner.go:130] > }
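Note the `NetworkReady` condition above is false ("cni plugin not initialized") because /etc/cni/net.d is still empty; that is the state that leads to the kindnet recommendation on the next lines. A small sketch, with field names as printed in the log, of pulling that condition out of `sudo crictl info`:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os/exec"
    )

    // criInfo mirrors the status.conditions section of `crictl info` output.
    type criInfo struct {
    	Status struct {
    		Conditions []struct {
    			Type    string `json:"type"`
    			Status  bool   `json:"status"`
    			Reason  string `json:"reason"`
    			Message string `json:"message"`
    		} `json:"conditions"`
    	} `json:"status"`
    }

    func main() {
    	out, err := exec.Command("sudo", "crictl", "info").Output()
    	if err != nil {
    		panic(err)
    	}
    	var info criInfo
    	if err := json.Unmarshal(out, &info); err != nil {
    		panic(err)
    	}
    	for _, c := range info.Status.Conditions {
    		if c.Type == "NetworkReady" && !c.Status {
    			fmt.Printf("CNI not ready: %s (%s)\n", c.Reason, c.Message)
    		}
    	}
    }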
	I1217 20:25:47.259959  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:47.259981  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:47.259991  414292 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:25:47.260020  414292 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:25:47.260142  414292 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
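The kubeadm config printed above is one multi-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration separated by ---), written to /var/tmp/minikube/kubeadm.yaml.new in the scp step below. A rough sketch of reading a single document back out of that stream with gopkg.in/yaml.v3 — the struct below is illustrative, not a kubeadm API type — e.g. to confirm the kubelet cgroupDriver:

    package main

    import (
    	"errors"
    	"fmt"
    	"io"
    	"os"

    	"gopkg.in/yaml.v3"
    )

    // doc captures just enough of each YAML document to identify it.
    type doc struct {
    	Kind         string `yaml:"kind"`
    	CgroupDriver string `yaml:"cgroupDriver"`
    }

    func main() {
    	f, err := os.Open("/var/tmp/minikube/kubeadm.yaml.new") // path from the scp step below
    	if err != nil {
    		panic(err)
    	}
    	defer f.Close()
    	dec := yaml.NewDecoder(f)
    	for {
    		var d doc
    		if err := dec.Decode(&d); errors.Is(err, io.EOF) {
    			break
    		} else if err != nil {
    			panic(err)
    		}
    		if d.Kind == "KubeletConfiguration" {
    			fmt.Println("kubelet cgroup driver:", d.CgroupDriver) // "cgroupfs" per the config above
    		}
    	}
    }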
	I1217 20:25:47.260216  414292 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:25:47.267498  414292 command_runner.go:130] > kubeadm
	I1217 20:25:47.267517  414292 command_runner.go:130] > kubectl
	I1217 20:25:47.267520  414292 command_runner.go:130] > kubelet
	I1217 20:25:47.268462  414292 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:25:47.268563  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:25:47.276438  414292 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:25:47.289778  414292 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:25:47.303155  414292 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1217 20:25:47.315864  414292 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:25:47.319319  414292 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1217 20:25:47.319605  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:47.441462  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:47.463080  414292 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:25:47.463150  414292 certs.go:195] generating shared ca certs ...
	I1217 20:25:47.463190  414292 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:47.463362  414292 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:25:47.463461  414292 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:25:47.463501  414292 certs.go:257] generating profile certs ...
	I1217 20:25:47.463662  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:25:47.463774  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:25:47.463860  414292 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:25:47.463894  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1217 20:25:47.463938  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1217 20:25:47.463977  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1217 20:25:47.464005  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1217 20:25:47.464049  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1217 20:25:47.464079  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1217 20:25:47.464117  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1217 20:25:47.464151  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1217 20:25:47.464241  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:47.464342  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:47.464377  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:47.464421  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:47.464488  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:47.464541  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:47.464629  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:47.464693  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.464733  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.464771  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.469220  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:25:47.495389  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:25:47.516308  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:25:47.535144  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:25:47.552466  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:25:47.570909  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:25:47.588173  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:25:47.606011  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:25:47.623433  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:47.640520  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:47.657751  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:47.675695  414292 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:25:47.688487  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:47.694560  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:47.694946  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.702368  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:47.710124  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713826  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713858  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713917  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.754917  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:47.755445  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:47.763008  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.770327  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:47.778030  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782014  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782042  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782099  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.822920  414292 command_runner.go:130] > b5213941
	I1217 20:25:47.823058  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:47.830582  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.837906  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:47.845640  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849463  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849531  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849600  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.890040  414292 command_runner.go:130] > 51391683
	I1217 20:25:47.890555  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
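The hashes printed above (3ec20f2e, b5213941, 51391683) are OpenSSL subject hashes: OpenSSL locates trusted CAs in /etc/ssl/certs by a symlink named "<hash>.0", which is exactly what each `openssl x509 -hash` / `ln -fs` / `test -L` triple sets up. A sketch of the same convention in Go, shelling out to openssl (paths illustrative, and the symlink needs root):

    package main

    import (
    	"fmt"
    	"os"
    	"os/exec"
    	"path/filepath"
    	"strings"
    )

    func main() {
    	cert := "/usr/share/ca-certificates/minikubeCA.pem"
    	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", cert).Output()
    	if err != nil {
    		panic(err)
    	}
    	hash := strings.TrimSpace(string(out)) // e.g. "b5213941" as in the log
    	link := filepath.Join("/etc/ssl/certs", hash+".0")
    	// Replace any stale link; equivalent to the `ln -fs` above.
    	_ = os.Remove(link)
    	if err := os.Symlink(cert, link); err != nil {
    		panic(err)
    	}
    	fmt.Println("trusted via", link)
    }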
	I1217 20:25:47.898150  414292 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901790  414292 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901872  414292 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1217 20:25:47.901887  414292 command_runner.go:130] > Device: 259,1	Inode: 1060771     Links: 1
	I1217 20:25:47.901895  414292 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.901902  414292 command_runner.go:130] > Access: 2025-12-17 20:21:41.033930957 +0000
	I1217 20:25:47.901907  414292 command_runner.go:130] > Modify: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901912  414292 command_runner.go:130] > Change: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901921  414292 command_runner.go:130] >  Birth: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901988  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:25:47.942293  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.942780  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:25:47.983019  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.983513  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:25:48.024341  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.024837  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:25:48.065771  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.066190  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:25:48.107223  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.107692  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:25:48.148374  414292 command_runner.go:130] > Certificate will not expire
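`openssl x509 -checkend 86400` exits zero when the certificate is still valid 86400 seconds (24h) from now, which is why each check above reports "Certificate will not expire". The same test in pure Go, using one of the cert paths from the log:

    package main

    import (
    	"crypto/x509"
    	"encoding/pem"
    	"fmt"
    	"os"
    	"time"
    )

    func main() {
    	data, err := os.ReadFile("/var/lib/minikube/certs/front-proxy-client.crt")
    	if err != nil {
    		panic(err)
    	}
    	block, _ := pem.Decode(data)
    	if block == nil {
    		panic("no PEM block found")
    	}
    	cert, err := x509.ParseCertificate(block.Bytes)
    	if err != nil {
    		panic(err)
    	}
    	// Mirrors -checkend 86400: is the cert still valid one day from now?
    	if cert.NotAfter.After(time.Now().Add(24 * time.Hour)) {
    		fmt.Println("Certificate will not expire")
    	} else {
    		fmt.Println("Certificate will expire within 24h")
    	}
    }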
	I1217 20:25:48.148810  414292 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:48.148912  414292 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:25:48.148983  414292 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:25:48.175983  414292 cri.go:89] found id: ""
	I1217 20:25:48.176056  414292 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:25:48.182939  414292 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1217 20:25:48.182960  414292 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1217 20:25:48.182967  414292 command_runner.go:130] > /var/lib/minikube/etcd:
	I1217 20:25:48.183854  414292 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:25:48.183910  414292 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:25:48.183977  414292 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:25:48.191197  414292 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:25:48.191635  414292 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.191740  414292 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "functional-682596" cluster setting kubeconfig missing "functional-682596" context setting]
	I1217 20:25:48.192034  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.192565  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.192744  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.193250  414292 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 20:25:48.193273  414292 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 20:25:48.193281  414292 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 20:25:48.193286  414292 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 20:25:48.193293  414292 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1217 20:25:48.193576  414292 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:25:48.193650  414292 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1217 20:25:48.201269  414292 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1217 20:25:48.201338  414292 kubeadm.go:602] duration metric: took 17.417602ms to restartPrimaryControlPlane
	I1217 20:25:48.201355  414292 kubeadm.go:403] duration metric: took 52.552362ms to StartCluster
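The `diff -u` of kubeadm.yaml against kubeadm.yaml.new just above is the reconfiguration gate: exit status 0 means the on-disk config already matches the generated one, so the control plane can restart as-is. A sketch of that exit-code check (diff exits 1 when the files differ and >1 on a real error; paths taken from the log):

    package main

    import (
    	"errors"
    	"fmt"
    	"os/exec"
    )

    // needsReconfig reports whether the generated kubeadm config differs
    // from the one already on disk.
    func needsReconfig(oldPath, newPath string) (bool, error) {
    	err := exec.Command("sudo", "diff", "-u", oldPath, newPath).Run()
    	if err == nil {
    		return false, nil // identical: restart without re-running kubeadm
    	}
    	var ee *exec.ExitError
    	if errors.As(err, &ee) && ee.ExitCode() == 1 {
    		return true, nil // files differ: reconfigure
    	}
    	return false, err // diff itself failed
    }

    func main() {
    	re, err := needsReconfig("/var/tmp/minikube/kubeadm.yaml", "/var/tmp/minikube/kubeadm.yaml.new")
    	if err != nil {
    		panic(err)
    	}
    	fmt.Println("needs reconfiguration:", re)
    }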
	I1217 20:25:48.201370  414292 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.201429  414292 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.202007  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.202208  414292 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:25:48.202539  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:48.202581  414292 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 20:25:48.202699  414292 addons.go:70] Setting storage-provisioner=true in profile "functional-682596"
	I1217 20:25:48.202717  414292 addons.go:239] Setting addon storage-provisioner=true in "functional-682596"
	I1217 20:25:48.202742  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.202770  414292 addons.go:70] Setting default-storageclass=true in profile "functional-682596"
	I1217 20:25:48.202806  414292 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-682596"
	I1217 20:25:48.203165  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.203224  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.208687  414292 out.go:179] * Verifying Kubernetes components...
	I1217 20:25:48.211692  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:48.230383  414292 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 20:25:48.233339  414292 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.233361  414292 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 20:25:48.233423  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.236813  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.236975  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.237238  414292 addons.go:239] Setting addon default-storageclass=true in "functional-682596"
	I1217 20:25:48.237267  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.237711  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.262897  414292 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:48.262919  414292 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 20:25:48.262996  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.269972  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.294767  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.418586  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:48.450623  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.465245  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.239916  414292 node_ready.go:35] waiting up to 6m0s for node "functional-682596" to be "Ready" ...
	I1217 20:25:49.240030  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.240095  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.240342  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240376  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240403  414292 retry.go:31] will retry after 252.350229ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240440  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240459  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240479  414292 retry.go:31] will retry after 321.821783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
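The empty Response status above means the GET to /api/v1/nodes/functional-682596 never reached the apiserver (it is still coming up), so the node-ready wait keeps polling inside its 6m budget. A bare-bones stdlib sketch of that poll shape; a real client authenticates with the kubeconfig's client cert and key, so the InsecureSkipVerify here is illustration only:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"net/http"
    	"time"
    )

    func main() {
    	client := &http.Client{
    		Timeout:   5 * time.Second,
    		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
    	}
    	url := "https://192.168.49.2:8441/api/v1/nodes/functional-682596"
    	deadline := time.Now().Add(6 * time.Minute)
    	for time.Now().Before(deadline) {
    		resp, err := client.Get(url)
    		if err == nil {
    			resp.Body.Close()
    			fmt.Println("apiserver answered:", resp.Status)
    			return
    		}
    		fmt.Println("not up yet:", err) // e.g. "connection refused" as in the log
    		time.Sleep(500 * time.Millisecond)
    	}
    	fmt.Println("timed out waiting for the apiserver")
    }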
	I1217 20:25:49.493033  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.547929  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.551638  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.551667  414292 retry.go:31] will retry after 328.531722ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.562869  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.621023  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.625124  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.625209  414292 retry.go:31] will retry after 442.103425ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.740481  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.740559  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.740872  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.881274  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.942102  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.945784  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.945890  414292 retry.go:31] will retry after 409.243705ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.068055  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.127397  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.131721  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.131759  414292 retry.go:31] will retry after 566.560423ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.241000  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.241406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.355732  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:50.414970  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.419857  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.419893  414292 retry.go:31] will retry after 763.212709ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.699479  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.741041  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.741134  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.741465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.776772  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.776815  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.776839  414292 retry.go:31] will retry after 1.24877806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.183473  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:51.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.240545  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:51.240594  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
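
Interleaved with the apply retries, the node_ready.go warnings show a second loop: every ~500ms minikube issues GET /api/v1/nodes/functional-682596 and checks the node's Ready condition, and while the apiserver is down each poll fails with connection refused. A client-go sketch of that readiness poll under the same assumptions (node name and kubeconfig path taken from the log; interval and timeout are guesses, not minikube's actual node_ready.go):

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitNodeReady polls the node object until its Ready condition is True,
// mirroring the GET /api/v1/nodes/functional-682596 loop in the log.
func waitNodeReady(name, kubeconfig string, timeout time.Duration) error {
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		return err
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		return err
	}
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		node, err := client.CoreV1().Nodes().Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			// matches the "error getting node ... (will retry)" warnings above
			fmt.Printf("error getting node %q (will retry): %v\n", name, err)
		} else {
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady && c.Status == corev1.ConditionTrue {
					return nil
				}
			}
		}
		time.Sleep(500 * time.Millisecond)
	}
	return fmt.Errorf("node %q never became Ready within %v", name, timeout)
}

func main() {
	if err := waitNodeReady("functional-682596", "/var/lib/minikube/kubeconfig", 5*time.Minute); err != nil {
		fmt.Println(err)
	}
}
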
	I1217 20:25:51.251909  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:51.255943  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.255983  414292 retry.go:31] will retry after 1.271740821s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.740532  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.740649  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.026483  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:52.095052  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.095119  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.095140  414292 retry.go:31] will retry after 1.58694383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.240430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.240682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
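
Each request trace above advertises Accept: application/vnd.kubernetes.protobuf,application/json: the client prefers the compact protobuf wire encoding and falls back to JSON. With client-go the equivalent preference is set on the rest.Config before building the clientset; a sketch, with the kubeconfig path taken from the log:

package main

import (
	"fmt"

	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
	if err != nil {
		panic(err)
	}
	// Prefer protobuf, fall back to JSON -- this yields the Accept
	// header seen in the round-tripper traces above.
	cfg.AcceptContentTypes = "application/vnd.kubernetes.protobuf,application/json"
	cfg.ContentType = "application/vnd.kubernetes.protobuf"
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	fmt.Printf("client ready: %T\n", client)
}
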
	I1217 20:25:52.528382  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:52.586445  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.590032  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.590066  414292 retry.go:31] will retry after 1.445188932s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.740386  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.740818  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:53.240660  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:53.682297  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:53.740043  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.740108  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.740352  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.743851  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:53.743882  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:53.743900  414292 retry.go:31] will retry after 2.69671946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.036496  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:54.096053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:54.096099  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.096122  414292 retry.go:31] will retry after 2.925706415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
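
The ssh_runner lines show how the addon manifests are actually applied: not through client-go, but by invoking the node's own kubectl binary with an explicit KUBECONFIG, as sudo KUBECONFIG=… kubectl apply --force -f <manifest>. minikube runs this over SSH inside the node container; the sketch below executes the same command locally with os/exec as a simplification (all paths are taken verbatim from the log):

package main

import (
	"fmt"
	"os/exec"
)

// applyManifest mirrors the ssh_runner command lines in the log.
// Passing KUBECONFIG=... as a leading sudo argument, as the log does,
// sets the variable in the elevated environment.
func applyManifest(kubectl, kubeconfig, manifest string) error {
	cmd := exec.Command("sudo", "KUBECONFIG="+kubeconfig,
		kubectl, "apply", "--force", "-f", manifest)
	out, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("apply %s: %v\noutput:\n%s", manifest, err, out)
	}
	return nil
}

func main() {
	err := applyManifest(
		"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
		"/var/lib/minikube/kubeconfig",
		"/etc/kubernetes/addons/storage-provisioner.yaml",
	)
	if err != nil {
		fmt.Println(err)
	}
}
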
	I1217 20:25:54.240487  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.240571  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.240903  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:54.740656  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.741104  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:55.240849  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.240918  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:55.241222  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:55.741059  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.741137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.440979  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:56.500702  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:56.500749  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.500767  414292 retry.go:31] will retry after 1.84810195s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.740201  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.740503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.023057  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:57.082954  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:57.083001  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.083020  414292 retry.go:31] will retry after 3.223759279s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.240558  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.740268  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.740347  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:57.740756  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:58.240571  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.240660  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:58.349268  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:58.403710  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:58.407286  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.407317  414292 retry.go:31] will retry after 3.305771044s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.740858  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.240560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.740145  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.740223  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:00.240307  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.240806  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:00.240857  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:00.307216  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:00.372358  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:00.376526  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.376564  414292 retry.go:31] will retry after 8.003704403s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.740135  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.740543  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.240281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.240535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.713237  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:01.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.741019  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.741278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.769053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:01.772711  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:01.772742  414292 retry.go:31] will retry after 3.267552643s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:02.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.240302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:02.740266  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:02.740769  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:03.240210  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:03.740228  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.240943  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.740734  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.741190  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:04.741246  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:05.040756  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:05.102503  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:05.102552  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.102572  414292 retry.go:31] will retry after 12.344413157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
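
The validation error itself is secondary: kubectl apply first downloads the server's OpenAPI schema (/openapi/v2) to validate the manifest, so while the apiserver is down even the validation step dies with connection refused, and the suggested --validate=false would only move the same failure to the apply call itself. One cheap way to gate such retries is to probe the apiserver's health endpoint first; a sketch (the /livez endpoint is served to unauthenticated clients by default, and certificate verification is skipped here only because this is a throwaway probe of a local test cluster; the URL is the apiserver address from the log):

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// apiserverUp does a quick GET against the apiserver's /livez endpoint,
// the kind of cheap liveness check one might run before retrying an
// apply. A sketch, not minikube's actual health check.
func apiserverUp(base string) bool {
	client := &http.Client{
		Timeout: 2 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(base + "/livez")
	if err != nil {
		fmt.Printf("apiserver not reachable: %v\n", err)
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK
}

func main() {
	fmt.Println(apiserverUp("https://192.168.49.2:8441"))
}
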
	I1217 20:26:05.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.240913  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.241244  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:05.740855  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.741188  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.241119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.241411  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.740571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:07.240279  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.240353  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:07.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:07.740195  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.740591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.240525  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.240914  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.381383  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:08.435212  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:08.439369  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.439410  414292 retry.go:31] will retry after 8.892819822s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.740968  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.741390  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.240230  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.240616  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.740331  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.740408  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.740742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:09.740801  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:10.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.240780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:10.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.240651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.740164  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.740235  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.740510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:12.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:12.240683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:12.740202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.240171  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.240526  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:14.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.240715  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.241059  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:14.241125  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:14.741075  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.741149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.741406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.240079  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.240494  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.740230  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.740334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.240695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:16.740834  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:17.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.240434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:17.333063  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:17.388410  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.391967  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.391995  414292 retry.go:31] will retry after 13.113728844s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.447345  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:17.505124  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.505163  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.505182  414292 retry.go:31] will retry after 11.452403849s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.740629  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.740885  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.240553  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.240633  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.740512  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.740589  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.740904  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:18.740955  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:19.240885  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.240962  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.241213  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:19.741015  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.741087  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.240627  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.740183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:21.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.240698  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:21.240754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:21.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.740628  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.240933  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.741110  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.741224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.741585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.240288  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.240369  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.741096  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.741162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.741447  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:23.741492  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:24.240108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.240184  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.240503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:24.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.740312  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.240472  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:26.240350  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:26.240856  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:26.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.240571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.740306  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.740387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.740718  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.240518  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.240860  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:28.240905  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:28.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.741110  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.958534  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:29.018842  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:29.024509  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.024543  414292 retry.go:31] will retry after 28.006345092s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.241080  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.241493  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:29.740997  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:30.241045  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.241435  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:30.241493  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:30.505938  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:30.574101  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:30.574147  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.574166  414292 retry.go:31] will retry after 31.982210322s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
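Note that kubectl inside the guest is refused on localhost:8441 while the readiness poll is refused on 192.168.49.2:8441; the same error on both addresses points at the apiserver process not listening on port 8441 at all, rather than a broken network path. A quick stdlib probe of that reading (addresses copied from the log; the localhost case only means anything when run from inside the guest):

	package main

	import (
		"fmt"
		"net"
		"time"
	)

	// Dial both endpoints this log keeps failing against. "connection refused"
	// from each means nothing is bound to port 8441 from either vantage point.
	func main() {
		for _, addr := range []string{"192.168.49.2:8441", "127.0.0.1:8441"} {
			conn, err := net.DialTimeout("tcp", addr, time.Second)
			if err != nil {
				fmt.Printf("%-20s %v\n", addr, err)
				continue
			}
			conn.Close()
			fmt.Printf("%-20s reachable\n", addr)
		}
	}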
	I1217 20:26:30.740490  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.740579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.740933  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:31.240692  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.240768  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.248432  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=7
	I1217 20:26:31.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.740287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.740084  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.740461  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:32.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:33.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:33.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.740635  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.240634  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.240711  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.241019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.740788  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.741122  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:34.741178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:35.240958  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.241039  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:35.740050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.740407  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.240271  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.740609  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:37.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.240213  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.240481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:37.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:37.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.240448  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.240537  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.240850  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.740388  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.740461  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.740786  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:39.240612  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.240699  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:39.241129  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:39.740905  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.740985  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.741321  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.240050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.240123  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.240466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.240336  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.740423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.740730  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:41.740787  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:42.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:42.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.740284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.740581  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.240458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.740597  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:44.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:44.240672  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:44.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.240370  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.240810  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.740481  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.740887  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:46.240669  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.240750  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:46.241046  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:46.740832  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.740907  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.741230  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.241112  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.241195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.241535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.740564  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.240565  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.240997  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.740893  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:48.741305  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:49.241092  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:49.740094  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.740170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.740483  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.240334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.240696  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.740130  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:51.240180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.240279  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:51.240658  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:51.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.740323  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.740662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.240355  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.240693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.740464  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.740824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:53.240545  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.240622  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:53.241022  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:53.740774  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.740855  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.240965  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.740570  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.240136  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.240208  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.240531  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:55.740689  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:56.240227  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.240326  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:56.740128  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.740207  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.740534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.031083  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:57.091368  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:57.091412  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.091434  414292 retry.go:31] will retry after 46.71155063s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
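By this point the storageclass apply has failed three times (20:26:17, 20:26:28 and 20:26:57) and the retry delay has grown to 46.7 s, while every Response line in this stretch of the readiness poll carries an empty status; both loops are blocked on the same refused connection and cannot make progress until an apiserver is listening on port 8441 again.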
	I1217 20:26:57.240719  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.240799  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.241113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.740782  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.740862  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.741143  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:57.741191  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:58.240610  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.240925  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:58.740701  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.740774  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.241090  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.241163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.241466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.740862  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.741174  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:59.741215  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:00.241177  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.241266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.241643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:00.740463  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.740543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.740888  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.240688  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.240764  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.241063  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.740881  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.740989  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.741337  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:01.741388  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:02.240100  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.240176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:02.557038  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:02.616976  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:02.620493  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.620531  414292 retry.go:31] will retry after 42.622456402s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.740802  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.740875  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.741140  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.240977  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.241074  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.241392  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.740139  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.740586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:04.240156  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.240238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:04.240579  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:04.740272  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.740738  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.240617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.740277  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:06.240183  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.240615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:06.240675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:06.740224  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.740355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.240397  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.240474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:08.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.240511  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.240866  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:08.240920  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:08.740675  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.740748  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.241042  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.241128  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.241481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.740097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.740192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.740525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.240559  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.740356  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.740782  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:10.740855  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:11.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.240674  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:11.740833  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.741195  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.241068  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.241147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.241476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:13.240124  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.240197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.240538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:13.240601  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:13.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.740596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.240723  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.240797  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.241150  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.740979  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.240146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.240479  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.740243  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:15.740748  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:27:16.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:16.240686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:16.740158  414292 type.go:168] "Request Body" body=""
	I1217 20:27:16.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:17.240299  414292 type.go:168] "Request Body" body=""
	I1217 20:27:17.240374  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:17.240705  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:17.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:27:17.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:17.740687  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:18.240626  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.240717  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.241052  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:18.241112  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:18.740882  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.740963  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.240070  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.240154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.240424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.240358  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.240772  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.740370  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:20.740740  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:21.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.240550  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.240867  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:21.740170  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.240286  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.240355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.740746  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:22.740812  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:23.240211  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:23.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.240106  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.240203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.240577  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.740327  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.740411  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.740719  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:25.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.240459  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.240721  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:25.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:25.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.740294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.240307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.741005  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.741084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.741343  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:27.241172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.241259  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.241612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:27.241675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:27.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.240549  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.240616  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.240862  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.740178  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.240606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.740469  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:29.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:30.240204  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.240594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:30.740299  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.740381  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.740724  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.240991  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.241062  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.241311  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.741040  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.741119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.741431  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:31.741486  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:32.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:32.740134  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.740528  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.240226  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.240327  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.240647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.740611  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:34.240530  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.240875  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:34.240919  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:34.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.740829  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.741167  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.240992  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.241075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.740071  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.740149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.240282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.740384  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.740734  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:36.740791  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:37.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.240426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.240679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:37.740203  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.240504  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.240898  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.740390  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.740487  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.740891  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:38.740956  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:39.240903  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.240984  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.241256  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:39.741008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.741080  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.241008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.241078  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.241381  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.740111  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.740522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:41.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:41.240677  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:41.740082  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.740465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.740376  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.740469  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.740797  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.240389  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.240470  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.240795  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:43.240842  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:43.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.803299  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:27:43.859759  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863204  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863297  414292 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
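The storageclass apply fails in client-side validation: kubectl cannot download the OpenAPI schema because nothing answers on localhost:8441. The error text suggests --validate=false, but skipping validation only removes the schema download; the apply itself still needs a reachable apiserver, which is why minikube retries rather than disabling validation. A minimal Go sketch of shelling out with that fallback anyway (hypothetical helper, sudo wrapper dropped):

    package main

    import (
        "fmt"
        "os/exec"
    )

    // applyManifest runs kubectl apply for one manifest and, if the first
    // attempt fails (for example because validation could not download the
    // OpenAPI schema), retries once with --validate=false.
    func applyManifest(kubectl, kubeconfig, manifest string) error {
        args := []string{"apply", "--force", "-f", manifest}
        run := func(extra ...string) ([]byte, error) {
            cmd := exec.Command(kubectl, append(args, extra...)...)
            cmd.Env = append(cmd.Environ(), "KUBECONFIG="+kubeconfig)
            return cmd.CombinedOutput()
        }
        out, err := run()
        if err != nil {
            fmt.Printf("apply failed, retrying without validation: %s\n", out)
            if out, err = run("--validate=false"); err != nil {
                return fmt.Errorf("apply failed: %v: %s", err, out)
            }
        }
        return nil
    }

    func main() {
        err := applyManifest(
            "/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
            "/var/lib/minikube/kubeconfig",
            "/etc/kubernetes/addons/storageclass.yaml",
        )
        fmt.Println("result:", err)
    }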
	I1217 20:27:44.241025  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.241121  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:44.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:45.240574  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.240742  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.242019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	W1217 20:27:45.242186  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:45.244153  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:45.319007  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319122  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319226  414292 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:45.322350  414292 out.go:179] * Enabled addons: 
	I1217 20:27:45.325857  414292 addons.go:530] duration metric: took 1m57.123269017s for enable addons: enabled=[]
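The "duration metric" line times the whole addon-enable phase, here 1m57s spent entirely on failed applies and retries, which is why the enabled list is empty. A minimal sketch of emitting such a metric:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        start := time.Now()
        enabled := []string{} // every addon callback failed, as in this log
        // ... addon enable callbacks would run here ...
        fmt.Printf("duration metric: took %s for enable addons: enabled=%v\n",
            time.Since(start), enabled)
    }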
	I1217 20:27:45.740623  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.740707  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.741048  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.240887  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.240956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.741061  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.741135  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.741496  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.745570  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.746779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:47.746914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:48.240563  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.240990  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:48.740796  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.740881  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.741219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.240326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.240395  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.740594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:50.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:50.240662  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:50.741072  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.741398  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.240183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.240123  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.240196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.740295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:52.740683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:53.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.240776  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:53.740345  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.740444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.740732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.240671  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.240743  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.241055  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.740834  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.740911  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.741241  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:54.741301  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... identical GET/response pairs for https://192.168.49.2:8441/api/v1/nodes/functional-682596 repeated every ~500ms from 20:27:55 through 20:28:55, all with empty responses; only the periodic node_ready.go retry warnings are reproduced below ...]
	W1217 20:27:57.240625  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:27:59.241390  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:01.740984  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:04.240799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:06.740914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:08.741223  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:11.240767  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:13.241122  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:15.740840  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:17.741241  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:20.240823  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:22.740654  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:24.741351  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:27.240679  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:29.241453  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:31.740674  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:34.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:36.240620  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:38.240859  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:40.740634  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:42.740831  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:45.241102  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:47.740657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:49.741649  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:52.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	W1217 20:28:54.241123  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:55.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.240178  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:56.740664  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:57.240172  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:57.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.740644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.240496  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.240566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.240825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.740558  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.740638  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:58.741026  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:59.240760  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.240835  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:59.740992  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.741067  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.241179  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.241274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.241594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.740545  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.740619  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.740922  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:01.240654  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.241023  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:01.241072  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:01.740787  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.740865  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.741183  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.240971  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.241393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.740980  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.741391  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.240071  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.240147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.240491  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.740172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:03.740593  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:04.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.240574  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:04.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.740712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.240264  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.740697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:05.740738  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:06.240370  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.240788  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:06.741116  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.741191  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.741539  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.240080  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.240463  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.740303  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:08.240575  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.241002  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:08.740757  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.740826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.741089  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.241059  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.241165  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.241510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.240444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.240756  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.740293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.740629  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:10.740685  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:11.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.240454  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:11.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.740475  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.740809  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.240548  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.240621  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.240958  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.740831  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.741131  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:12.741180  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:13.240743  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.240816  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.241082  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:13.740877  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.740957  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.741251  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.241067  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.241141  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.241456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.740157  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.740261  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.740657  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:15.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:15.240849  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:15.740520  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.740600  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.740926  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.240670  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.240741  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:17.241030  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.241485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:17.241544  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:17.741007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.741075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.741330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.240431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.740233  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.240118  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.240527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.740241  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:19.740652  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:20.240357  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:20.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.740702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.240449  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.240532  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.240864  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.741119  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:21.741177  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:22.240878  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.240947  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.241200  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:22.740985  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.741058  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.741409  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.240195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.240546  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.740154  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:24.240534  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.240612  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.240947  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:24.241000  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:24.740777  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.740857  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.741204  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.240998  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.241333  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.741123  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.741202  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.741530  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.240642  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.740432  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.740744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:26.740792  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.240585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:27.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.740392  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.740767  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.240517  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.240857  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.740679  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.741120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:28.741173  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:29.240914  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.240995  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.241335  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:29.740969  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.741045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.741327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.240069  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.240150  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.240512  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:31.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.240521  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:31.240571  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:31.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.740238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.740576  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.240165  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.240262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.740217  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.740555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:33.240266  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.240665  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:33.240725  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:33.740182  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.740619  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.240178  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.240456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.740169  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.740676  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.240238  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.240333  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:35.740724  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:36.240205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.240297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.240641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:36.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.740448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.240379  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.740614  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:38.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.240670  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.241007  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:38.241051  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:38.740806  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.740880  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.741145  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.240077  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.240158  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.240533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.740575  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.240271  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.240352  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.740751  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:40.740807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:41.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.240603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:41.740073  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.740152  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.740438  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.240671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:42.740846  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:43.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.240416  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:43.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.740242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.240555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.241028  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.740766  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.740836  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.741107  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:44.741148  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:45.241078  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.241164  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.241604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:45.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.240352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:47.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.240462  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.240824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:47.240891  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:47.740486  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.740555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.740822  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.240800  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.240885  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.241255  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.741124  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.741203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.741664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.240445  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.240522  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.240794  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.743007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.743084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.743421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:49.743497  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:50.241133  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.241229  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.241648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:50.740153  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.740228  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.240301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.240639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.740398  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.740482  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.740812  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:52.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.240749  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:52.240807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:52.740175  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.740621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.240207  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.740074  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.740144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.740420  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.240179  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.240492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.740266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:54.740639  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:55.240282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.240356  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:55.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.240062  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.240140  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.240485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.740524  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:57.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.240658  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:57.240715  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
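("connect: connection refused" means the TCP handshake reached 192.168.49.2 but nothing was accepting on port 8441, i.e. the apiserver process was down or still restarting, rather than the host being unreachable. A minimal sketch that reproduces the same dial error from Go; the endpoint is copied from the log, the 2-second timeout is an assumption.)

```go
// Minimal sketch: attempt a raw TCP connection to the apiserver endpoint
// from the log. While the apiserver is down this prints the same
// "connection refused" error; once it listens again, the dial succeeds.
package main

import (
	"log"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "192.168.49.2:8441", 2*time.Second)
	if err != nil {
		log.Printf("dial failed (matches the log's error): %v", err)
		return
	}
	defer conn.Close()
	log.Println("apiserver port is accepting connections again")
}
```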
	I1217 20:29:57.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.740339  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.740672  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.240543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.240844  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.740304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.740643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:59.240481  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.240553  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:59.240975  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:59.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.740722  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:00.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:30:00.240720  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:00.241049  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:00.741056  414292 type.go:168] "Request Body" body=""
	I1217 20:30:00.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:00.741446  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:01.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.240569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:01.740198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:01.740706  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:02.240225  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.240325  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:02.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.740599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.240638  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.740781  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:03.740851  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:04.240827  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.240905  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.241246  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:04.741038  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.741117  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.741482  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.240086  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.240166  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.240523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.740583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:06.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:06.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:06.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.740641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.240137  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.240232  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.740212  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.740677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:08.240572  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.240703  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.241012  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:08.740470  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.740547  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.740807  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.240838  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.240937  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.241278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.741105  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.741196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.741522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.240237  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.240593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.740639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:10.740692  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:11.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.240314  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.240666  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:11.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.740357  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.740745  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:12.740799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:13.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.240338  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.240649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:13.740420  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.740503  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.740905  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.240722  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.240802  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.241096  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.740830  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.740904  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.741180  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:14.741236  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:15.241024  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.241450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:15.741152  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.741234  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.741523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:17.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:17.240659  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:17.740234  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.740318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.240499  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.240576  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.240897  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:19.240591  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.240664  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:19.240962  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:19.740705  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.240934  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.241283  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.741051  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.741126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.741393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.240193  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.240534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.740204  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.740636  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:21.740690  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:22.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:30:22.240185  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:22.240483  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:22.740163  414292 type.go:168] "Request Body" body=""
	I1217 20:30:22.740266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:22.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:23.240329  414292 type.go:168] "Request Body" body=""
	I1217 20:30:23.240410  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:23.240708  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:23.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:30:23.740421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:23.740715  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:23.740754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:24.240749  414292 type.go:168] "Request Body" body=""
	I1217 20:30:24.240827  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:24.241165  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:24.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:30:24.741018  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:24.741341  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:25.241107  414292 type.go:168] "Request Body" body=""
	I1217 20:30:25.241181  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:25.241513  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:25.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:30:25.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:25.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:26.240231  414292 type.go:168] "Request Body" body=""
	I1217 20:30:26.240323  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:26.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:26.240712  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:26.740311  414292 type.go:168] "Request Body" body=""
	I1217 20:30:26.740386  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:26.740706  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:27.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:30:27.240224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:27.240591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:27.740314  414292 type.go:168] "Request Body" body=""
	I1217 20:30:27.740399  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:27.740736  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:28.240510  414292 type.go:168] "Request Body" body=""
	I1217 20:30:28.240580  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:28.240845  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:28.240885  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:28.740684  414292 type.go:168] "Request Body" body=""
	I1217 20:30:28.740769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:28.741122  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:29.240868  414292 type.go:168] "Request Body" body=""
	I1217 20:30:29.240943  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:29.241278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:29.741010  414292 type.go:168] "Request Body" body=""
	I1217 20:30:29.741088  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:29.741347  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:30.240074  414292 type.go:168] "Request Body" body=""
	I1217 20:30:30.240156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:30.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:30.740270  414292 type.go:168] "Request Body" body=""
	I1217 20:30:30.740350  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:30.740682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:30.740742  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:31.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:30:31.240426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:31.240709  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:31.740175  414292 type.go:168] "Request Body" body=""
	I1217 20:30:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:31.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:32.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:30:32.240290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:32.240644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:32.740323  414292 type.go:168] "Request Body" body=""
	I1217 20:30:32.740399  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:32.740721  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:32.740777  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:33.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:30:33.240282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:33.240581  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:33.740289  414292 type.go:168] "Request Body" body=""
	I1217 20:30:33.740384  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:33.740725  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:34.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:30:34.240601  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:34.240877  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:34.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:30:34.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:34.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:35.240329  414292 type.go:168] "Request Body" body=""
	I1217 20:30:35.240407  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:35.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:35.240807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:35.740356  414292 type.go:168] "Request Body" body=""
	I1217 20:30:35.740430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:35.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:36.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:30:36.240292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:36.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:36.740204  414292 type.go:168] "Request Body" body=""
	I1217 20:30:36.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:36.740621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:37.240083  414292 type.go:168] "Request Body" body=""
	I1217 20:30:37.240154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:37.240449  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:37.740150  414292 type.go:168] "Request Body" body=""
	I1217 20:30:37.740225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:37.740559  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:37.740619  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:38.240565  414292 type.go:168] "Request Body" body=""
	I1217 20:30:38.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:38.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:38.740678  414292 type.go:168] "Request Body" body=""
	I1217 20:30:38.740753  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:38.741008  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:39.241038  414292 type.go:168] "Request Body" body=""
	I1217 20:30:39.241115  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:39.241418  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:39.740131  414292 type.go:168] "Request Body" body=""
	I1217 20:30:39.740209  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:39.740536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:40.240118  414292 type.go:168] "Request Body" body=""
	I1217 20:30:40.240199  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:40.240498  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:40.240543  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:40.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:30:40.740188  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:40.740547  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:41.240309  414292 type.go:168] "Request Body" body=""
	I1217 20:30:41.240393  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:41.240720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:41.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:30:41.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:41.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:42.240206  414292 type.go:168] "Request Body" body=""
	I1217 20:30:42.240307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:42.240670  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:42.240729  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:42.740417  414292 type.go:168] "Request Body" body=""
	I1217 20:30:42.740498  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:42.740825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:43.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:30:43.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:43.240782  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:43.740183  414292 type.go:168] "Request Body" body=""
	I1217 20:30:43.740276  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:43.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:44.240187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:44.240292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:44.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:44.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:30:44.740144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:44.740421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:44.740464  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:45.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:30:45.240361  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:45.240784  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:45.740213  414292 type.go:168] "Request Body" body=""
	I1217 20:30:45.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:45.740704  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:46.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:30:46.241111  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:46.241363  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:46.741131  414292 type.go:168] "Request Body" body=""
	I1217 20:30:46.741206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:46.741497  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:46.741543  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:47.240089  414292 type.go:168] "Request Body" body=""
	I1217 20:30:47.240162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:47.240507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:47.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:30:47.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:47.740533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:48.240579  414292 type.go:168] "Request Body" body=""
	I1217 20:30:48.240662  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:48.240968  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:48.740776  414292 type.go:168] "Request Body" body=""
	I1217 20:30:48.740848  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:48.741156  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:49.241082  414292 type.go:168] "Request Body" body=""
	I1217 20:30:49.241169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:49.241428  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:49.241479  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:49.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:49.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:49.740617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:50.240370  414292 type.go:168] "Request Body" body=""
	I1217 20:30:50.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:50.240786  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:50.740363  414292 type.go:168] "Request Body" body=""
	I1217 20:30:50.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:50.740703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:51.240208  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.240317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:51.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.740425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:51.740821  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:52.240214  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.240300  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:52.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.240242  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.240341  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.240677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.740762  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:54.240735  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.240807  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.241141  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:54.241194  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:54.740836  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.741298  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.241043  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.241118  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.740103  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.740188  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.740556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.240272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:56.740726  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:57.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.240613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:57.740358  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.740443  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.240510  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.240843  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:59.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:59.240707  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:59.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.740428  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.740694  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.240295  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.240376  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.240702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.740561  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.740646  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:01.240812  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.240892  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:01.241213  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:01.740960  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.741043  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.741371  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.240103  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.240187  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.740227  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.740315  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.240590  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.740279  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.740349  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:03.740743  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:04.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.240829  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:04.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.740119  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.740198  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.740527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:06.240220  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.240652  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:06.240704  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:06.740395  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.740474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.740826  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.240699  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.740382  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:08.240694  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.241125  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:08.241178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:08.740933  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.741009  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.741272  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.240107  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.240509  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.740653  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.240221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.240567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.740283  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.740720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:10.740781  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:11.240308  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.240387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.240742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.240582  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.740305  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:13.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.240753  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:13.240804  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:13.740494  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.740566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.740865  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.240695  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.241120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.740883  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.741207  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:15.241001  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.241086  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.241424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:15.241480  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:15.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.240120  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.240190  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.240504  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.740176  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.740245  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.740595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:17.740646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:18.240578  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.240656  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.241010  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:18.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.240589  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.240665  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.240973  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.740527  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.740602  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.740938  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:19.741004  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:20.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.240826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:20.740847  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.741205  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.240999  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.241433  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.741076  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.741157  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.741476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:21.741535  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:22.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.240225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:22.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.740288  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.740671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.240361  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.240439  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.740727  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:24.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.240834  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.241219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:24.241275  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:24.740840  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.741224  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.240994  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.241325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.741146  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.741238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.741600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.240173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.740222  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.740302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:26.740604  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:27.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.740267  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.240847  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.740580  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.740654  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:28.741030  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:29.240927  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.241003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.241345  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:29.740932  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.741003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.741297  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.240066  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.240144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.240477  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:31.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.240227  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:31.240572  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:31.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.240364  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.240793  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.740739  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:33.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:33.240635  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:33.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.740654  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.240597  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.240677  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.240945  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.740794  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.741113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:35.240931  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:35.241431  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:35.740086  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.740458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.740185  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.240233  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.740273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.740567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:37.740616  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:38.240625  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.240697  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:38.740867  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.741194  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.240204  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.740275  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.740669  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:39.740728  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:40.240374  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.240701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:40.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.740679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.240409  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.240499  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.240858  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:41.740768  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:42.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.240703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:42.740571  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.740645  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.740967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.240727  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.240796  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.241050  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.740827  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.740901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.741236  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:43.741293  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:44.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.241176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.241525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:44.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.745967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1217 20:31:45.240798  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.240901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.241310  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:45.741143  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.741226  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.741583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:45.741646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:46.241073  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.241146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.241399  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:46.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.240190  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.740649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:48.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.240554  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.241013  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:48.241064  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:48.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.740603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:49.240608  414292 type.go:168] "Request Body" body=""
	I1217 20:31:49.240675  414292 node_ready.go:38] duration metric: took 6m0.000721639s for node "functional-682596" to be "Ready" ...
	I1217 20:31:49.243794  414292 out.go:203] 
	W1217 20:31:49.246551  414292 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1217 20:31:49.246575  414292 out.go:285] * 
	W1217 20:31:49.249079  414292 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:31:49.251429  414292 out.go:203] 
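The failure above is minikube's node-readiness wait timing out: node_ready.go repeats the same GET against /api/v1/nodes/functional-682596 every 500ms until the node reports Ready or the 6m deadline passes, and every attempt here ended in "connection refused". Below is a minimal client-go sketch of that polling pattern, as an illustration only, not minikube's actual implementation:

// nodeready_sketch.go: a hedged sketch of the poll loop seen above -
// GET the node every 500ms until its Ready condition is True or the
// context deadline expires. Assumes k8s.io/client-go and a kubeconfig.
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func waitNodeReady(ctx context.Context, cs *kubernetes.Clientset, name string) error {
	ticker := time.NewTicker(500 * time.Millisecond)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			return fmt.Errorf("waiting for node %q to be Ready: %w", name, ctx.Err())
		case <-ticker.C:
			node, err := cs.CoreV1().Nodes().Get(ctx, name, metav1.GetOptions{})
			if err != nil {
				continue // e.g. "connection refused" while the apiserver is down; retry
			}
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady && c.Status == corev1.ConditionTrue {
					return nil
				}
			}
		}
	}
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ctx, cancel := context.WithTimeout(context.Background(), 6*time.Minute)
	defer cancel()
	fmt.Println(waitNodeReady(ctx, cs, "functional-682596"))
}

With the apiserver never coming up, a loop like this can only exhaust its deadline, which is exactly the GUEST_START exit above.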
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.001848112Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.001930616Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002035519Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002109095Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002171430Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002240592Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002308310Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002378620Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002450473Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002551603Z" level=info msg="Connect containerd service"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.003013015Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.003712558Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.020658049Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.020740938Z" level=info msg="Start recovering state"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.023140602Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.023368658Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066191638Z" level=info msg="Start event monitor"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066246202Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066258756Z" level=info msg="Start streaming server"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066268225Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066275971Z" level=info msg="runtime interface starting up..."
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066282896Z" level=info msg="starting plugins..."
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066296787Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:25:47 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.067121641Z" level=info msg="containerd successfully booted in 0.091187s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:31:51.086233    8510 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:51.086840    8510 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:51.088647    8510 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:51.089356    8510 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:51.090989    8510 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:31:51 up  3:14,  0 user,  load average: 0.24, 0.31, 0.77
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:31:47 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:48 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 808.
	Dec 17 20:31:48 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:48 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:48 functional-682596 kubelet[8394]: E1217 20:31:48.533457    8394 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:48 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:48 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:49 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 809.
	Dec 17 20:31:49 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:49 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:49 functional-682596 kubelet[8400]: E1217 20:31:49.366130    8400 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:49 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:49 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:49 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 810.
	Dec 17 20:31:49 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:49 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:50 functional-682596 kubelet[8406]: E1217 20:31:50.053899    8406 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 17 20:31:50 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:50 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:50 functional-682596 kubelet[8434]: E1217 20:31:50.809013    8434 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (408.532947ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (368.20s)
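The `--format={{.APIServer}}` flag used in the post-mortem above is a Go text/template rendered against minikube's status struct. A hedged illustration with a stand-in struct; the field names mirror the output seen in this report, but this is not minikube's actual type:

// statusfmt_sketch.go: a minimal sketch of template-driven status output.
package main

import (
	"os"
	"text/template"
)

// Status is an illustrative stand-in for the struct minikube renders.
type Status struct {
	Host      string
	Kubelet   string
	APIServer string
}

func main() {
	st := Status{Host: "Running", Kubelet: "Stopped", APIServer: "Stopped"}
	tmpl := template.Must(template.New("status").Parse("{{.APIServer}}\n"))
	if err := tmpl.Execute(os.Stdout, st); err != nil { // prints "Stopped", as above
		panic(err)
	}
}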

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.33s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-682596 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-682596 get po -A: exit status 1 (73.606435ms)

                                                
                                                
** stderr ** 
	E1217 20:31:52.321483  418236 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:31:52.323163  418236 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:31:52.324811  418236 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:31:52.326370  418236 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:31:52.327960  418236 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-682596 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"E1217 20:31:52.321483  418236 memcache.go:265] \"Unhandled Error\" err=\"couldn't get current server API group list: Get \\\"https://192.168.49.2:8441/api?timeout=32s\\\": dial tcp 192.168.49.2:8441: connect: connection refused\"\nE1217 20:31:52.323163  418236 memcache.go:265] \"Unhandled Error\" err=\"couldn't get current server API group list: Get \\\"https://192.168.49.2:8441/api?timeout=32s\\\": dial tcp 192.168.49.2:8441: connect: connection refused\"\nE1217 20:31:52.324811  418236 memcache.go:265] \"Unhandled Error\" err=\"couldn't get current server API group list: Get \\\"https://192.168.49.2:8441/api?timeout=32s\\\": dial tcp 192.168.49.2:8441: connect: connection refused\"\nE1217 20:31:52.326370  418236 memcache.go:265] \"Unhandled Error\" err=\"couldn't get current server API group list: Get \\\"https://192.168.49.2:8441/api?timeout=32s\\\": dial tcp 192.168.49.2:8441: connect: connection refused\"\nE1217 20:31:52.327960  418236 memcache.go:265] \"Unhandled Error\" err=\"couldn't get current server API group list: Get \\\"https://192.168.49.2:8441/api?timeout=32s\\\": dial tcp 192.168.49.2:8441: connect: connection refused\"\nThe connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-682596 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-682596 get po -A"
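kubectl is failing with the same symptom as the earlier start: nothing is listening on 192.168.49.2:8441. The probe reduces to a plain TCP dial, as in this sketch:

// dialcheck_sketch.go: a hedged sketch reproducing the probe behind the
// "dial tcp 192.168.49.2:8441: connect: connection refused" errors above.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "192.168.49.2:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver unreachable:", err) // e.g. connection refused
		return
	}
	conn.Close()
	fmt.Println("apiserver port is accepting connections")
}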
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
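The inspect output shows the container itself is healthy: State.Status is "running" and 8441/tcp is published to 127.0.0.1:33166, so the refused connections come from inside the node (no apiserver process behind the port), not from Docker networking. A hedged sketch for extracting that port mapping with `docker inspect -f`, wrapped in Go to match the rest of these examples:

// portmap_sketch.go: pull the host port for 8441/tcp out of
// `docker inspect`, matching the NetworkSettings.Ports block above.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	out, err := exec.Command("docker", "inspect", "-f",
		`{{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}`,
		"functional-682596").Output()
	if err != nil {
		panic(err)
	}
	fmt.Println("apiserver published on 127.0.0.1:" + strings.TrimSpace(string(out)))
}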
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (313.074062ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-032730 ssh sudo cat /usr/share/ca-certificates/3694612.pem                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh sudo cat /etc/test/nested/copy/369461/hosts                                                                                               │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save kicbase/echo-server:functional-032730 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image rm kicbase/echo-server:functional-032730 --alsologtostderr                                                                              │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ update-context │ functional-032730 update-context --alsologtostderr -v=2                                                                                                         │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image save --daemon kicbase/echo-server:functional-032730 --alsologtostderr                                                                   │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format yaml --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format short --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format json --alsologtostderr                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls --format table --alsologtostderr                                                                                                     │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh            │ functional-032730 ssh pgrep buildkitd                                                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image          │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                          │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image          │ functional-032730 image ls                                                                                                                                      │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete         │ -p functional-032730                                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start          │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start          │ -p functional-682596 --alsologtostderr -v=8                                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:25:44
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:25:44.045489  414292 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:25:44.045686  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.045714  414292 out.go:374] Setting ErrFile to fd 2...
	I1217 20:25:44.045733  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.046029  414292 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:25:44.046470  414292 out.go:368] Setting JSON to false
	I1217 20:25:44.047409  414292 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11289,"bootTime":1765991855,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:25:44.047515  414292 start.go:143] virtualization:  
	I1217 20:25:44.053027  414292 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:25:44.056011  414292 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:25:44.056093  414292 notify.go:221] Checking for updates...
	I1217 20:25:44.061883  414292 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:25:44.064833  414292 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:44.067589  414292 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:25:44.070446  414292 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:25:44.073380  414292 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:25:44.076968  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:44.077128  414292 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:25:44.112208  414292 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:25:44.112455  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.167112  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.158029599 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.167209  414292 docker.go:319] overlay module found
	I1217 20:25:44.170171  414292 out.go:179] * Using the docker driver based on existing profile
	I1217 20:25:44.173086  414292 start.go:309] selected driver: docker
	I1217 20:25:44.173109  414292 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.173214  414292 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:25:44.173330  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.234258  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.225129855 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.234785  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:44.234848  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:44.234909  414292 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.238034  414292 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:25:44.240853  414292 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:25:44.243760  414292 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:25:44.246713  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:44.246768  414292 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:25:44.246782  414292 cache.go:65] Caching tarball of preloaded images
	I1217 20:25:44.246797  414292 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:25:44.246869  414292 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:25:44.246880  414292 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:25:44.246994  414292 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:25:44.265764  414292 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:25:44.265789  414292 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:25:44.265812  414292 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:25:44.265841  414292 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:25:44.265903  414292 start.go:364] duration metric: took 36.013µs to acquireMachinesLock for "functional-682596"
	I1217 20:25:44.265927  414292 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:25:44.265936  414292 fix.go:54] fixHost starting: 
	I1217 20:25:44.266187  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:44.282574  414292 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:25:44.282603  414292 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:25:44.285918  414292 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:25:44.285950  414292 machine.go:94] provisionDockerMachine start ...
	I1217 20:25:44.286031  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.302759  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.303096  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.303111  414292 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:25:44.431913  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.431939  414292 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:25:44.432002  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.450770  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.451117  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.451136  414292 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:25:44.601580  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.601732  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.619103  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.619412  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.619435  414292 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
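The shell fragment above is the idempotent /etc/hosts update minikube pushes over SSH: the outer grep skips the whole block if the hostname is already present, and the inner branch either rewrites the existing 127.0.1.1 entry or appends a new one. A quick manual check of the result (a sketch, assuming the container name from this run):

	docker exec functional-682596 grep '^127.0.1.1' /etc/hosts
	# expected: 127.0.1.1 functional-682596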
	I1217 20:25:44.748545  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:25:44.748571  414292 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:25:44.748593  414292 ubuntu.go:190] setting up certificates
	I1217 20:25:44.748603  414292 provision.go:84] configureAuth start
	I1217 20:25:44.748675  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:44.766057  414292 provision.go:143] copyHostCerts
	I1217 20:25:44.766100  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766141  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:25:44.766152  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766226  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:25:44.766327  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766347  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:25:44.766357  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766385  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:25:44.766441  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766461  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:25:44.766471  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766501  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:25:44.766561  414292 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
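The server certificate generated here must carry every name the API server can be reached by (the san=[...] list in the line above). To confirm the SANs actually landed in the issued certificate, something like the following works, using the server.pem path from the log:

	openssl x509 -noout -text \
	  -in /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem \
	  | grep -A1 'Subject Alternative Name'
	# should list DNS:functional-682596, DNS:localhost, DNS:minikube,
	# IP:127.0.0.1 and IP:192.168.49.2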
	I1217 20:25:45.107844  414292 provision.go:177] copyRemoteCerts
	I1217 20:25:45.108657  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:25:45.108873  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.149674  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.277212  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1217 20:25:45.277284  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:25:45.298737  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1217 20:25:45.298796  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:25:45.320659  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1217 20:25:45.320720  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:25:45.338755  414292 provision.go:87] duration metric: took 590.128101ms to configureAuth
	I1217 20:25:45.338800  414292 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:25:45.338978  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:45.339040  414292 machine.go:97] duration metric: took 1.053082119s to provisionDockerMachine
	I1217 20:25:45.339048  414292 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:25:45.339059  414292 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:25:45.339122  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:25:45.339165  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.356059  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.452345  414292 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:25:45.455946  414292 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1217 20:25:45.455965  414292 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1217 20:25:45.455970  414292 command_runner.go:130] > VERSION_ID="12"
	I1217 20:25:45.455975  414292 command_runner.go:130] > VERSION="12 (bookworm)"
	I1217 20:25:45.455980  414292 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1217 20:25:45.455983  414292 command_runner.go:130] > ID=debian
	I1217 20:25:45.455989  414292 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1217 20:25:45.455994  414292 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1217 20:25:45.456008  414292 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1217 20:25:45.456046  414292 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:25:45.456062  414292 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:25:45.456073  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:25:45.456130  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:25:45.456208  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:25:45.456215  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /etc/ssl/certs/3694612.pem
	I1217 20:25:45.456308  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:25:45.456313  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> /etc/test/nested/copy/369461/hosts
	I1217 20:25:45.456356  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:25:45.464083  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.481460  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:25:45.500420  414292 start.go:296] duration metric: took 161.357637ms for postStartSetup
	I1217 20:25:45.500542  414292 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:25:45.500615  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.517677  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.609195  414292 command_runner.go:130] > 18%
	I1217 20:25:45.609800  414292 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:25:45.614741  414292 command_runner.go:130] > 159G
	I1217 20:25:45.614774  414292 fix.go:56] duration metric: took 1.348835133s for fixHost
	I1217 20:25:45.614785  414292 start.go:83] releasing machines lock for "functional-682596", held for 1.348870218s
	I1217 20:25:45.614866  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:45.631621  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:45.631685  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:45.631702  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:45.631735  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:45.631767  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:45.631798  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:45.631848  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.631888  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.631907  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.631926  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.631943  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:45.631995  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.649517  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.754346  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:45.772163  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:45.789636  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:45.795706  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:45.796203  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.803937  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:45.811516  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815311  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815389  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815474  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.856132  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:45.856705  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:45.864064  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.871519  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:45.879293  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883196  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883238  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883306  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.924322  414292 command_runner.go:130] > b5213941
	I1217 20:25:45.924802  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:45.932259  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.939603  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:45.947311  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.950955  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951320  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951411  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.993968  414292 command_runner.go:130] > 51391683
	I1217 20:25:45.994167  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:25:46.002855  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:25:46.007551  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
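The values 3ec20f2e, b5213941 and 51391683 printed above are OpenSSL subject-name hashes; OpenSSL resolves trust anchors in /etc/ssl/certs through <hash>.0 symlinks, which is exactly what the ln -fs / test -L pairs maintain. The same convention reproduced by hand (a sketch using the minikubeCA.pem path from this run):

	h=$(openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem)
	sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem "/etc/ssl/certs/${h}.0"
	# h is b5213941 for the CA above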
	I1217 20:25:46.011748  414292 ssh_runner.go:195] Run: cat /version.json
	I1217 20:25:46.011837  414292 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:25:46.016112  414292 command_runner.go:130] > {"iso_version": "v1.37.0-1765579389-22117", "kicbase_version": "v0.0.48-1765661130-22141", "minikube_version": "v1.37.0", "commit": "cbb33128a244032d08f8fc6e6c9f03b30f0da3e4"}
	I1217 20:25:46.018576  414292 ssh_runner.go:195] Run: systemctl --version
	I1217 20:25:46.126907  414292 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1217 20:25:46.127016  414292 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1217 20:25:46.127060  414292 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1217 20:25:46.127172  414292 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1217 20:25:46.131726  414292 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1217 20:25:46.131887  414292 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:25:46.131965  414292 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:25:46.140024  414292 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:25:46.140047  414292 start.go:496] detecting cgroup driver to use...
	I1217 20:25:46.140078  414292 detect.go:187] detected "cgroupfs" cgroup driver on host os
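minikube matches the container runtime's cgroup driver to what the host reports (CgroupDriver:cgroupfs in the docker info above). As a side check, one way to see which cgroup hierarchy a host is actually running:

	stat -fc %T /sys/fs/cgroup
	# tmpfs => cgroup v1 (hybrid), cgroup2fs => cgroup v2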
	I1217 20:25:46.140156  414292 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:25:46.155753  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:25:46.168916  414292 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:25:46.169009  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:25:46.184457  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:25:46.197441  414292 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:25:46.302684  414292 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:25:46.421553  414292 docker.go:234] disabling docker service ...
	I1217 20:25:46.421621  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:25:46.436823  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:25:46.449890  414292 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:25:46.565021  414292 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:25:46.678341  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:25:46.693104  414292 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:25:46.705993  414292 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
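The /etc/crictl.yaml written above is simply crictl's default-endpoint file; on a host where that file is absent, the equivalent per-invocation form would be:

	sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock version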
	I1217 20:25:46.707385  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:25:46.716410  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:25:46.724756  414292 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:25:46.724876  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:25:46.733647  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.742030  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:25:46.750673  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.759312  414292 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:25:46.768595  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:25:46.777345  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:25:46.786196  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
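Taken together, the sed edits above rewrite /etc/containerd/config.toml in place: the sandbox image is pinned to registry.k8s.io/pause:3.10.1, SystemdCgroup is forced to false to match the cgroupfs driver, the legacy io.containerd.runtime.v1.linux and runc.v1 shims are mapped to io.containerd.runc.v2, and unprivileged ports are enabled under the CRI plugin. A quick way to confirm the edits took effect:

	grep -nE 'sandbox_image|SystemdCgroup|enable_unprivileged_ports|conf_dir' \
	  /etc/containerd/config.toml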
	I1217 20:25:46.795479  414292 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:25:46.802392  414292 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1217 20:25:46.803423  414292 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:25:46.811004  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:46.926090  414292 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 20:25:47.068989  414292 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:25:47.069169  414292 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:25:47.073250  414292 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1217 20:25:47.073355  414292 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1217 20:25:47.073385  414292 command_runner.go:130] > Device: 0,72	Inode: 1618        Links: 1
	I1217 20:25:47.073441  414292 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.073470  414292 command_runner.go:130] > Access: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073512  414292 command_runner.go:130] > Modify: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073542  414292 command_runner.go:130] > Change: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073561  414292 command_runner.go:130] >  Birth: -
	I1217 20:25:47.073923  414292 start.go:564] Will wait 60s for crictl version
	I1217 20:25:47.074046  414292 ssh_runner.go:195] Run: which crictl
	I1217 20:25:47.077775  414292 command_runner.go:130] > /usr/local/bin/crictl
	I1217 20:25:47.078218  414292 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:25:47.104139  414292 command_runner.go:130] > Version:  0.1.0
	I1217 20:25:47.104225  414292 command_runner.go:130] > RuntimeName:  containerd
	I1217 20:25:47.104269  414292 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1217 20:25:47.104295  414292 command_runner.go:130] > RuntimeApiVersion:  v1
	I1217 20:25:47.106475  414292 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:25:47.106628  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.130403  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.132698  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.152199  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.159813  414292 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:25:47.162759  414292 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
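The long --format template above flattens the Docker network into a single JSON document; the individual fields come out of the same inspect data with much simpler templates, e.g.:

	docker network inspect functional-682596 \
	  --format '{{range .IPAM.Config}}{{.Subnet}} {{.Gateway}}{{end}}'
	# e.g. 192.168.49.0/24 192.168.49.1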
	I1217 20:25:47.179237  414292 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:25:47.183476  414292 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1217 20:25:47.183701  414292 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:25:47.183825  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:47.183890  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.207538  414292 command_runner.go:130] > {
	I1217 20:25:47.207560  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.207564  414292 command_runner.go:130] >     {
	I1217 20:25:47.207574  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.207582  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207588  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.207591  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207595  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207607  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.207614  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207618  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.207625  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207630  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207636  414292 command_runner.go:130] >     },
	I1217 20:25:47.207639  414292 command_runner.go:130] >     {
	I1217 20:25:47.207647  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.207655  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207660  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.207664  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207668  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207678  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.207684  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207688  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.207692  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207696  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207698  414292 command_runner.go:130] >     },
	I1217 20:25:47.207702  414292 command_runner.go:130] >     {
	I1217 20:25:47.207709  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.207715  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207720  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.207735  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207747  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207756  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.207759  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207763  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.207766  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.207770  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207773  414292 command_runner.go:130] >     },
	I1217 20:25:47.207776  414292 command_runner.go:130] >     {
	I1217 20:25:47.207783  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.207787  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207791  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.207795  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207798  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207806  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.207809  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207813  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.207817  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207822  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207826  414292 command_runner.go:130] >       },
	I1217 20:25:47.207833  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207837  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207842  414292 command_runner.go:130] >     },
	I1217 20:25:47.207846  414292 command_runner.go:130] >     {
	I1217 20:25:47.207853  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.207859  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207865  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.207867  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207872  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207886  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.207890  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207894  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.207897  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207906  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207915  414292 command_runner.go:130] >       },
	I1217 20:25:47.207928  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207932  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207934  414292 command_runner.go:130] >     },
	I1217 20:25:47.207938  414292 command_runner.go:130] >     {
	I1217 20:25:47.207947  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.207955  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207961  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.207964  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207968  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207976  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.207982  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207986  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.207990  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207997  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208001  414292 command_runner.go:130] >       },
	I1217 20:25:47.208020  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208028  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208032  414292 command_runner.go:130] >     },
	I1217 20:25:47.208035  414292 command_runner.go:130] >     {
	I1217 20:25:47.208042  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.208049  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208054  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.208058  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208062  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208069  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.208074  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208079  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.208082  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208088  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208091  414292 command_runner.go:130] >     },
	I1217 20:25:47.208097  414292 command_runner.go:130] >     {
	I1217 20:25:47.208104  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.208114  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208120  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.208123  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208128  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208142  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.208146  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208149  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.208153  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208157  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208163  414292 command_runner.go:130] >       },
	I1217 20:25:47.208168  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208173  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208177  414292 command_runner.go:130] >     },
	I1217 20:25:47.208183  414292 command_runner.go:130] >     {
	I1217 20:25:47.208189  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.208195  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208200  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.208203  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208207  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208215  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.208221  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208225  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.208229  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208233  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.208237  414292 command_runner.go:130] >       },
	I1217 20:25:47.208240  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208245  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.208339  414292 command_runner.go:130] >     }
	I1217 20:25:47.208342  414292 command_runner.go:130] >   ]
	I1217 20:25:47.208344  414292 command_runner.go:130] > }
	I1217 20:25:47.208525  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.208539  414292 containerd.go:534] Images already preloaded, skipping extraction
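The JSON above is what the preload check parses: every image required for v1.35.0-rc.1 on containerd is already present, so extraction is skipped. Listing just the tags from the same output is a one-liner (assuming jq is available on the host; it is not part of the minikube image):

	sudo crictl images --output json | jq -r '.images[].repoTags[]'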
	I1217 20:25:47.208601  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.230634  414292 command_runner.go:130] > {
	I1217 20:25:47.230653  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.230659  414292 command_runner.go:130] >     {
	I1217 20:25:47.230668  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.230673  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230679  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.230683  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230687  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230696  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.230703  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230721  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.230725  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230729  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230735  414292 command_runner.go:130] >     },
	I1217 20:25:47.230741  414292 command_runner.go:130] >     {
	I1217 20:25:47.230756  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.230764  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230769  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.230773  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230786  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230798  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.230801  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230812  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.230816  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230819  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230823  414292 command_runner.go:130] >     },
	I1217 20:25:47.230826  414292 command_runner.go:130] >     {
	I1217 20:25:47.230833  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.230839  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230844  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.230857  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230888  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230900  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.230911  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230916  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.230923  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.230927  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230936  414292 command_runner.go:130] >     },
	I1217 20:25:47.230939  414292 command_runner.go:130] >     {
	I1217 20:25:47.230946  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.230950  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230954  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.230960  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230964  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230972  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.230984  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230988  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.230991  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.230995  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.230998  414292 command_runner.go:130] >       },
	I1217 20:25:47.231003  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231009  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231012  414292 command_runner.go:130] >     },
	I1217 20:25:47.231018  414292 command_runner.go:130] >     {
	I1217 20:25:47.231024  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.231037  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231042  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.231045  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231050  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231063  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.231067  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231071  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.231074  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231087  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231093  414292 command_runner.go:130] >       },
	I1217 20:25:47.231097  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231111  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231117  414292 command_runner.go:130] >     },
	I1217 20:25:47.231125  414292 command_runner.go:130] >     {
	I1217 20:25:47.231132  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.231138  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231144  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.231151  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231155  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231164  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.231168  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231172  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.231178  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231194  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231200  414292 command_runner.go:130] >       },
	I1217 20:25:47.231204  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231208  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231211  414292 command_runner.go:130] >     },
	I1217 20:25:47.231214  414292 command_runner.go:130] >     {
	I1217 20:25:47.231223  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.231238  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231246  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.231250  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231254  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231264  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.231276  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231280  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.231284  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231288  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231291  414292 command_runner.go:130] >     },
	I1217 20:25:47.231294  414292 command_runner.go:130] >     {
	I1217 20:25:47.231309  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.231317  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231323  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.231333  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231337  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231347  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.231359  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231363  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.231366  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231370  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231373  414292 command_runner.go:130] >       },
	I1217 20:25:47.231379  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231392  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231395  414292 command_runner.go:130] >     },
	I1217 20:25:47.231405  414292 command_runner.go:130] >     {
	I1217 20:25:47.231412  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.231418  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231423  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.231428  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231437  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231445  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.231448  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231452  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.231455  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231459  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.231462  414292 command_runner.go:130] >       },
	I1217 20:25:47.231466  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231469  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.231473  414292 command_runner.go:130] >     }
	I1217 20:25:47.231479  414292 command_runner.go:130] >   ]
	I1217 20:25:47.231482  414292 command_runner.go:130] > }
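
The block above is the runtime's image inventory as minikube reads it back; the doubled spaces after the colons are containerd's protojson encoding, not corruption, and "size" arrives as a decimal string. A minimal Go sketch of decoding such a payload, assuming a top-level "images" array as crictl images -o json emits, and a hypothetical capture file images.json:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os"
    )

    // criImage mirrors the fields visible in the dump above; "size" is a
    // decimal string in the CRI protojson encoding, not a number.
    type criImage struct {
    	ID          string   `json:"id"`
    	RepoTags    []string `json:"repoTags"`
    	RepoDigests []string `json:"repoDigests"`
    	Size        string   `json:"size"`
    	Pinned      bool     `json:"pinned"`
    }

    type imageList struct {
    	Images []criImage `json:"images"`
    }

    func main() {
    	raw, err := os.ReadFile("images.json") // hypothetical capture of the dump above
    	if err != nil {
    		panic(err)
    	}
    	var list imageList
    	if err := json.Unmarshal(raw, &list); err != nil {
    		panic(err)
    	}
    	for _, img := range list.Images {
    		fmt.Println(img.RepoTags, img.Size, "pinned:", img.Pinned)
    	}
    }
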
	I1217 20:25:47.233897  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.233919  414292 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:25:47.233928  414292 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:25:47.234041  414292 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1217 20:25:47.234107  414292 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:25:47.256786  414292 command_runner.go:130] > {
	I1217 20:25:47.256808  414292 command_runner.go:130] >   "cniconfig": {
	I1217 20:25:47.256814  414292 command_runner.go:130] >     "Networks": [
	I1217 20:25:47.256818  414292 command_runner.go:130] >       {
	I1217 20:25:47.256823  414292 command_runner.go:130] >         "Config": {
	I1217 20:25:47.256827  414292 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1217 20:25:47.256833  414292 command_runner.go:130] >           "Name": "cni-loopback",
	I1217 20:25:47.256837  414292 command_runner.go:130] >           "Plugins": [
	I1217 20:25:47.256840  414292 command_runner.go:130] >             {
	I1217 20:25:47.256846  414292 command_runner.go:130] >               "Network": {
	I1217 20:25:47.256851  414292 command_runner.go:130] >                 "ipam": {},
	I1217 20:25:47.256863  414292 command_runner.go:130] >                 "type": "loopback"
	I1217 20:25:47.256875  414292 command_runner.go:130] >               },
	I1217 20:25:47.256880  414292 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1217 20:25:47.256883  414292 command_runner.go:130] >             }
	I1217 20:25:47.256887  414292 command_runner.go:130] >           ],
	I1217 20:25:47.256896  414292 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1217 20:25:47.256900  414292 command_runner.go:130] >         },
	I1217 20:25:47.256911  414292 command_runner.go:130] >         "IFName": "lo"
	I1217 20:25:47.256917  414292 command_runner.go:130] >       }
	I1217 20:25:47.256920  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256924  414292 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1217 20:25:47.256927  414292 command_runner.go:130] >     "PluginDirs": [
	I1217 20:25:47.256932  414292 command_runner.go:130] >       "/opt/cni/bin"
	I1217 20:25:47.256941  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256945  414292 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1217 20:25:47.256949  414292 command_runner.go:130] >     "Prefix": "eth"
	I1217 20:25:47.256952  414292 command_runner.go:130] >   },
	I1217 20:25:47.256957  414292 command_runner.go:130] >   "config": {
	I1217 20:25:47.256962  414292 command_runner.go:130] >     "cdiSpecDirs": [
	I1217 20:25:47.256965  414292 command_runner.go:130] >       "/etc/cdi",
	I1217 20:25:47.256969  414292 command_runner.go:130] >       "/var/run/cdi"
	I1217 20:25:47.256977  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256985  414292 command_runner.go:130] >     "cni": {
	I1217 20:25:47.256991  414292 command_runner.go:130] >       "binDir": "",
	I1217 20:25:47.256995  414292 command_runner.go:130] >       "binDirs": [
	I1217 20:25:47.256999  414292 command_runner.go:130] >         "/opt/cni/bin"
	I1217 20:25:47.257003  414292 command_runner.go:130] >       ],
	I1217 20:25:47.257008  414292 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1217 20:25:47.257025  414292 command_runner.go:130] >       "confTemplate": "",
	I1217 20:25:47.257029  414292 command_runner.go:130] >       "ipPref": "",
	I1217 20:25:47.257033  414292 command_runner.go:130] >       "maxConfNum": 1,
	I1217 20:25:47.257040  414292 command_runner.go:130] >       "setupSerially": false,
	I1217 20:25:47.257044  414292 command_runner.go:130] >       "useInternalLoopback": false
	I1217 20:25:47.257049  414292 command_runner.go:130] >     },
	I1217 20:25:47.257057  414292 command_runner.go:130] >     "containerd": {
	I1217 20:25:47.257061  414292 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1217 20:25:47.257069  414292 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1217 20:25:47.257076  414292 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1217 20:25:47.257080  414292 command_runner.go:130] >       "runtimes": {
	I1217 20:25:47.257084  414292 command_runner.go:130] >         "runc": {
	I1217 20:25:47.257097  414292 command_runner.go:130] >           "ContainerAnnotations": null,
	I1217 20:25:47.257102  414292 command_runner.go:130] >           "PodAnnotations": null,
	I1217 20:25:47.257106  414292 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1217 20:25:47.257111  414292 command_runner.go:130] >           "cgroupWritable": false,
	I1217 20:25:47.257119  414292 command_runner.go:130] >           "cniConfDir": "",
	I1217 20:25:47.257123  414292 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1217 20:25:47.257127  414292 command_runner.go:130] >           "io_type": "",
	I1217 20:25:47.257133  414292 command_runner.go:130] >           "options": {
	I1217 20:25:47.257139  414292 command_runner.go:130] >             "BinaryName": "",
	I1217 20:25:47.257143  414292 command_runner.go:130] >             "CriuImagePath": "",
	I1217 20:25:47.257148  414292 command_runner.go:130] >             "CriuWorkPath": "",
	I1217 20:25:47.257154  414292 command_runner.go:130] >             "IoGid": 0,
	I1217 20:25:47.257158  414292 command_runner.go:130] >             "IoUid": 0,
	I1217 20:25:47.257162  414292 command_runner.go:130] >             "NoNewKeyring": false,
	I1217 20:25:47.257174  414292 command_runner.go:130] >             "Root": "",
	I1217 20:25:47.257186  414292 command_runner.go:130] >             "ShimCgroup": "",
	I1217 20:25:47.257193  414292 command_runner.go:130] >             "SystemdCgroup": false
	I1217 20:25:47.257196  414292 command_runner.go:130] >           },
	I1217 20:25:47.257206  414292 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1217 20:25:47.257213  414292 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1217 20:25:47.257217  414292 command_runner.go:130] >           "runtimePath": "",
	I1217 20:25:47.257224  414292 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1217 20:25:47.257229  414292 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1217 20:25:47.257233  414292 command_runner.go:130] >           "snapshotter": ""
	I1217 20:25:47.257238  414292 command_runner.go:130] >         }
	I1217 20:25:47.257241  414292 command_runner.go:130] >       }
	I1217 20:25:47.257246  414292 command_runner.go:130] >     },
	I1217 20:25:47.257261  414292 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1217 20:25:47.257269  414292 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1217 20:25:47.257274  414292 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1217 20:25:47.257280  414292 command_runner.go:130] >     "disableApparmor": false,
	I1217 20:25:47.257290  414292 command_runner.go:130] >     "disableHugetlbController": true,
	I1217 20:25:47.257294  414292 command_runner.go:130] >     "disableProcMount": false,
	I1217 20:25:47.257299  414292 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1217 20:25:47.257303  414292 command_runner.go:130] >     "enableCDI": true,
	I1217 20:25:47.257309  414292 command_runner.go:130] >     "enableSelinux": false,
	I1217 20:25:47.257313  414292 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1217 20:25:47.257318  414292 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1217 20:25:47.257325  414292 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1217 20:25:47.257331  414292 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1217 20:25:47.257336  414292 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1217 20:25:47.257340  414292 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1217 20:25:47.257353  414292 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1217 20:25:47.257358  414292 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257362  414292 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1217 20:25:47.257368  414292 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257375  414292 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1217 20:25:47.257379  414292 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1217 20:25:47.257386  414292 command_runner.go:130] >   },
	I1217 20:25:47.257390  414292 command_runner.go:130] >   "features": {
	I1217 20:25:47.257396  414292 command_runner.go:130] >     "supplemental_groups_policy": true
	I1217 20:25:47.257399  414292 command_runner.go:130] >   },
	I1217 20:25:47.257403  414292 command_runner.go:130] >   "golang": "go1.24.9",
	I1217 20:25:47.257416  414292 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257429  414292 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257433  414292 command_runner.go:130] >   "runtimeHandlers": [
	I1217 20:25:47.257436  414292 command_runner.go:130] >     {
	I1217 20:25:47.257447  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257451  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257455  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257460  414292 command_runner.go:130] >       }
	I1217 20:25:47.257463  414292 command_runner.go:130] >     },
	I1217 20:25:47.257469  414292 command_runner.go:130] >     {
	I1217 20:25:47.257473  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257477  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257481  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257484  414292 command_runner.go:130] >       },
	I1217 20:25:47.257488  414292 command_runner.go:130] >       "name": "runc"
	I1217 20:25:47.257494  414292 command_runner.go:130] >     }
	I1217 20:25:47.257497  414292 command_runner.go:130] >   ],
	I1217 20:25:47.257502  414292 command_runner.go:130] >   "status": {
	I1217 20:25:47.257506  414292 command_runner.go:130] >     "conditions": [
	I1217 20:25:47.257509  414292 command_runner.go:130] >       {
	I1217 20:25:47.257514  414292 command_runner.go:130] >         "message": "",
	I1217 20:25:47.257526  414292 command_runner.go:130] >         "reason": "",
	I1217 20:25:47.257530  414292 command_runner.go:130] >         "status": true,
	I1217 20:25:47.257536  414292 command_runner.go:130] >         "type": "RuntimeReady"
	I1217 20:25:47.257539  414292 command_runner.go:130] >       },
	I1217 20:25:47.257543  414292 command_runner.go:130] >       {
	I1217 20:25:47.257549  414292 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1217 20:25:47.257554  414292 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1217 20:25:47.257563  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257568  414292 command_runner.go:130] >         "type": "NetworkReady"
	I1217 20:25:47.257574  414292 command_runner.go:130] >       },
	I1217 20:25:47.257577  414292 command_runner.go:130] >       {
	I1217 20:25:47.257599  414292 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1217 20:25:47.257609  414292 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1217 20:25:47.257615  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257620  414292 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1217 20:25:47.257626  414292 command_runner.go:130] >       }
	I1217 20:25:47.257629  414292 command_runner.go:130] >     ]
	I1217 20:25:47.257631  414292 command_runner.go:130] >   }
	I1217 20:25:47.257634  414292 command_runner.go:130] > }
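
sudo crictl info returns the status dump above; NetworkReady is false only because kindnet has not yet written a CNI config into /etc/cni/net.d, which the very next lines resolve by creating the CNI manager. A minimal sketch of checking that condition programmatically, assuming crictl is on PATH and using the field names visible in the dump:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os/exec"
    )

    type runtimeCondition struct {
    	Type    string `json:"type"`
    	Status  bool   `json:"status"`
    	Reason  string `json:"reason"`
    	Message string `json:"message"`
    }

    type crictlInfo struct {
    	Status struct {
    		Conditions []runtimeCondition `json:"conditions"`
    	} `json:"status"`
    }

    func main() {
    	// Same invocation as the log's "sudo crictl info".
    	out, err := exec.Command("sudo", "crictl", "info").Output()
    	if err != nil {
    		panic(err)
    	}
    	var info crictlInfo
    	if err := json.Unmarshal(out, &info); err != nil {
    		panic(err)
    	}
    	for _, c := range info.Status.Conditions {
    		if c.Type == "NetworkReady" && !c.Status {
    			fmt.Printf("CNI not ready: %s (%s)\n", c.Reason, c.Message)
    		}
    	}
    }
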
	I1217 20:25:47.259959  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:47.259981  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:47.259991  414292 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:25:47.260020  414292 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:25:47.260142  414292 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1217 20:25:47.260216  414292 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:25:47.267498  414292 command_runner.go:130] > kubeadm
	I1217 20:25:47.267517  414292 command_runner.go:130] > kubectl
	I1217 20:25:47.267520  414292 command_runner.go:130] > kubelet
	I1217 20:25:47.268462  414292 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:25:47.268563  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:25:47.276438  414292 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:25:47.289778  414292 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:25:47.303155  414292 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
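
At this point the rendered kubeadm/kubelet/kube-proxy stream shown above has been written to /var/tmp/minikube/kubeadm.yaml.new, to be diffed against the existing config further down. A rough sketch of sanity-checking such a multi-document file, assuming gopkg.in/yaml.v3 as the YAML library (an assumption of this sketch, not something the log shows):

    package main

    import (
    	"bytes"
    	"errors"
    	"fmt"
    	"io"
    	"os"

    	"gopkg.in/yaml.v3" // assumed dependency for this sketch
    )

    func main() {
    	raw, err := os.ReadFile("/var/tmp/minikube/kubeadm.yaml.new")
    	if err != nil {
    		panic(err)
    	}
    	// The file is a multi-document YAML stream; Decode returns one
    	// document per call until io.EOF.
    	dec := yaml.NewDecoder(bytes.NewReader(raw))
    	for {
    		var doc map[string]interface{}
    		if err := dec.Decode(&doc); err != nil {
    			if errors.Is(err, io.EOF) {
    				break
    			}
    			panic(err)
    		}
    		fmt.Printf("kind=%v apiVersion=%v\n", doc["kind"], doc["apiVersion"])
    		if v, ok := doc["kubernetesVersion"]; ok && v != "v1.35.0-rc.1" {
    			fmt.Println("unexpected kubernetesVersion:", v)
    		}
    	}
    }
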
	I1217 20:25:47.315864  414292 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:25:47.319319  414292 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
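
The grep above verifies that the control-plane alias already resolves inside the guest before kubelet is restarted. A pure-Go stand-in for that check (same file, same tab-separated entry):

    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"strings"
    )

    // Checks for the alias that the log's grep matches:
    // grep "192.168.49.2<TAB>control-plane.minikube.internal$" /etc/hosts
    func main() {
    	f, err := os.Open("/etc/hosts")
    	if err != nil {
    		panic(err)
    	}
    	defer f.Close()

    	want := "192.168.49.2\tcontrol-plane.minikube.internal"
    	sc := bufio.NewScanner(f)
    	for sc.Scan() {
    		if strings.TrimRight(sc.Text(), " ") == want {
    			fmt.Println("hosts entry present, nothing to add")
    			return
    		}
    	}
    	fmt.Println("hosts entry missing; minikube would append it")
    }
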
	I1217 20:25:47.319605  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:47.441462  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:47.463080  414292 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:25:47.463150  414292 certs.go:195] generating shared ca certs ...
	I1217 20:25:47.463190  414292 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:47.463362  414292 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:25:47.463461  414292 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:25:47.463501  414292 certs.go:257] generating profile certs ...
	I1217 20:25:47.463662  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:25:47.463774  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:25:47.463860  414292 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:25:47.463894  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1217 20:25:47.463938  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1217 20:25:47.463977  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1217 20:25:47.464005  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1217 20:25:47.464049  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1217 20:25:47.464079  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1217 20:25:47.464117  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1217 20:25:47.464151  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1217 20:25:47.464241  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:47.464342  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:47.464377  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:47.464421  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:47.464488  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:47.464541  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:47.464629  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:47.464693  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.464733  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.464771  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.469220  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:25:47.495389  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:25:47.516308  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:25:47.535144  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:25:47.552466  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:25:47.570909  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:25:47.588173  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:25:47.606011  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:25:47.623433  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:47.640520  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:47.657751  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:47.675695  414292 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:25:47.688487  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:47.694560  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:47.694946  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.702368  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:47.710124  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713826  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713858  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713917  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.754917  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:47.755445  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:47.763008  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.770327  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:47.778030  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782014  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782042  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782099  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.822920  414292 command_runner.go:130] > b5213941
	I1217 20:25:47.823058  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:47.830582  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.837906  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:47.845640  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849463  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849531  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849600  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.890040  414292 command_runner.go:130] > 51391683
	I1217 20:25:47.890555  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
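
The three openssl/ln exchanges above install each CA under its OpenSSL subject hash (3ec20f2e, b5213941, 51391683) in /etc/ssl/certs. A minimal sketch of the same procedure, shelling out to openssl for the hash exactly as the log does; the paths are illustrative, and the symlink step needs the root privileges that sudo provides in the log:

    package main

    import (
    	"fmt"
    	"os"
    	"os/exec"
    	"path/filepath"
    	"strings"
    )

    // installCA links a PEM certificate into certsDir under its OpenSSL
    // subject hash, e.g. /etc/ssl/certs/b5213941.0 -> minikubeCA.pem.
    func installCA(pemPath, certsDir string) error {
    	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pemPath).Output()
    	if err != nil {
    		return err
    	}
    	hash := strings.TrimSpace(string(out))
    	link := filepath.Join(certsDir, hash+".0")
    	_ = os.Remove(link) // replicate ln -f: replace any stale link
    	return os.Symlink(pemPath, link)
    }

    func main() {
    	if err := installCA("/usr/share/ca-certificates/minikubeCA.pem", "/etc/ssl/certs"); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    }
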
	I1217 20:25:47.898150  414292 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901790  414292 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901872  414292 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1217 20:25:47.901887  414292 command_runner.go:130] > Device: 259,1	Inode: 1060771     Links: 1
	I1217 20:25:47.901895  414292 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.901902  414292 command_runner.go:130] > Access: 2025-12-17 20:21:41.033930957 +0000
	I1217 20:25:47.901907  414292 command_runner.go:130] > Modify: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901912  414292 command_runner.go:130] > Change: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901921  414292 command_runner.go:130] >  Birth: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901988  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:25:47.942293  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.942780  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:25:47.983019  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.983513  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:25:48.024341  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.024837  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:25:48.065771  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.066190  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:25:48.107223  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.107692  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:25:48.148374  414292 command_runner.go:130] > Certificate will not expire
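
Each -checkend 86400 call above asks whether a certificate will still be valid 24 hours from now. The same test can be done without openssl using Go's standard library; a minimal sketch:

    package main

    import (
    	"crypto/x509"
    	"encoding/pem"
    	"fmt"
    	"os"
    	"time"
    )

    // checkEnd is the equivalent of `openssl x509 -noout -in path -checkend 86400`:
    // it succeeds only if the certificate remains valid for the whole window.
    func checkEnd(path string, window time.Duration) error {
    	raw, err := os.ReadFile(path)
    	if err != nil {
    		return err
    	}
    	block, _ := pem.Decode(raw)
    	if block == nil {
    		return fmt.Errorf("%s: no PEM block found", path)
    	}
    	cert, err := x509.ParseCertificate(block.Bytes)
    	if err != nil {
    		return err
    	}
    	if time.Now().Add(window).After(cert.NotAfter) {
    		return fmt.Errorf("%s expires within %s (NotAfter=%s)", path, window, cert.NotAfter)
    	}
    	return nil
    }

    func main() {
    	if err := checkEnd("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	fmt.Println("Certificate will not expire")
    }
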
	I1217 20:25:48.148810  414292 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:48.148912  414292 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:25:48.148983  414292 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:25:48.175983  414292 cri.go:89] found id: ""
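
StartCluster first asks the runtime for any surviving kube-system containers; the empty found id: "" means none outlived the restart. The query is plain crictl with a label filter, sketched here under the assumption that crictl is on PATH:

    package main

    import (
    	"fmt"
    	"os/exec"
    	"strings"
    )

    func main() {
    	// Mirrors the log's command:
    	// crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system
    	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet",
    		"--label", "io.kubernetes.pod.namespace=kube-system").Output()
    	if err != nil {
    		panic(err)
    	}
    	ids := strings.Fields(string(out))
    	if len(ids) == 0 {
    		fmt.Println("no kube-system containers found")
    		return
    	}
    	for _, id := range ids {
    		fmt.Println("found container:", id)
    	}
    }
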
	I1217 20:25:48.176056  414292 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:25:48.182939  414292 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1217 20:25:48.182960  414292 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1217 20:25:48.182967  414292 command_runner.go:130] > /var/lib/minikube/etcd:
	I1217 20:25:48.183854  414292 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:25:48.183910  414292 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:25:48.183977  414292 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:25:48.191197  414292 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:25:48.191635  414292 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.191740  414292 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "functional-682596" cluster setting kubeconfig missing "functional-682596" context setting]
	I1217 20:25:48.192034  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.192565  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.192744  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.193250  414292 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 20:25:48.193273  414292 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 20:25:48.193281  414292 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 20:25:48.193286  414292 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 20:25:48.193293  414292 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
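
The repair step above noticed that the kubeconfig lacked both a cluster and a context named after the profile and rewrote the file. A minimal sketch of the detection half, assuming k8s.io/client-go is available (minikube uses the same library, but this is not its exact code):

    package main

    import (
    	"fmt"
    	"os"

    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// e.g. /home/jenkins/minikube-integration/21808-367595/kubeconfig
    	path := os.Getenv("KUBECONFIG")
    	cfg, err := clientcmd.LoadFromFile(path)
    	if err != nil {
    		panic(err)
    	}
    	const profile = "functional-682596"
    	if _, ok := cfg.Clusters[profile]; !ok {
    		fmt.Printf("kubeconfig missing %q cluster setting\n", profile)
    	}
    	if _, ok := cfg.Contexts[profile]; !ok {
    		fmt.Printf("kubeconfig missing %q context setting\n", profile)
    	}
    }
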
	I1217 20:25:48.193576  414292 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:25:48.193650  414292 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1217 20:25:48.201269  414292 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1217 20:25:48.201338  414292 kubeadm.go:602] duration metric: took 17.417602ms to restartPrimaryControlPlane
	I1217 20:25:48.201355  414292 kubeadm.go:403] duration metric: took 52.552362ms to StartCluster
	I1217 20:25:48.201370  414292 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.201429  414292 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.202007  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.202208  414292 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:25:48.202539  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:48.202581  414292 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 20:25:48.202699  414292 addons.go:70] Setting storage-provisioner=true in profile "functional-682596"
	I1217 20:25:48.202717  414292 addons.go:239] Setting addon storage-provisioner=true in "functional-682596"
	I1217 20:25:48.202742  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.202770  414292 addons.go:70] Setting default-storageclass=true in profile "functional-682596"
	I1217 20:25:48.202806  414292 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-682596"
	I1217 20:25:48.203165  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.203224  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.208687  414292 out.go:179] * Verifying Kubernetes components...
	I1217 20:25:48.211692  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:48.230383  414292 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 20:25:48.233339  414292 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.233361  414292 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 20:25:48.233423  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.236813  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.236975  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.237238  414292 addons.go:239] Setting addon default-storageclass=true in "functional-682596"
	I1217 20:25:48.237267  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.237711  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.262897  414292 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:48.262919  414292 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 20:25:48.262996  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.269972  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.294767  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.418586  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:48.450623  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.465245  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.239916  414292 node_ready.go:35] waiting up to 6m0s for node "functional-682596" to be "Ready" ...
	I1217 20:25:49.240030  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.240095  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.240342  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240376  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240403  414292 retry.go:31] will retry after 252.350229ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240440  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240459  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240479  414292 retry.go:31] will retry after 321.821783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.493033  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.547929  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.551638  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.551667  414292 retry.go:31] will retry after 328.531722ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.562869  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.621023  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.625124  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.625209  414292 retry.go:31] will retry after 442.103425ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.740481  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.740559  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.740872  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.881274  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.942102  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.945784  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.945890  414292 retry.go:31] will retry after 409.243705ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.068055  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.127397  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.131721  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.131759  414292 retry.go:31] will retry after 566.560423ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.241000  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.241406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.355732  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:50.414970  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.419857  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.419893  414292 retry.go:31] will retry after 763.212709ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.699479  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.741041  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.741134  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.741465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.776772  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.776815  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.776839  414292 retry.go:31] will retry after 1.24877806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.183473  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:51.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.240545  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:51.240594  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:51.251909  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:51.255943  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.255983  414292 retry.go:31] will retry after 1.271740821s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.740532  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.740649  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.026483  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:52.095052  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.095119  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.095140  414292 retry.go:31] will retry after 1.58694383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.240430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.240682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.528382  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:52.586445  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.590032  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.590066  414292 retry.go:31] will retry after 1.445188932s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.740386  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.740818  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:53.240660  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:53.682297  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:53.740043  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.740108  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.740352  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.743851  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:53.743882  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:53.743900  414292 retry.go:31] will retry after 2.69671946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.036496  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:54.096053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:54.096099  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.096122  414292 retry.go:31] will retry after 2.925706415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.240487  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.240571  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.240903  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:54.740656  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.741104  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:55.240849  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.240918  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:55.241222  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:55.741059  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.741137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.440979  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:56.500702  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:56.500749  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.500767  414292 retry.go:31] will retry after 1.84810195s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.740201  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.740503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.023057  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:57.082954  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:57.083001  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.083020  414292 retry.go:31] will retry after 3.223759279s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.240558  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.740268  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.740347  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:57.740756  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:58.240571  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.240660  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:58.349268  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:58.403710  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:58.407286  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.407317  414292 retry.go:31] will retry after 3.305771044s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.740858  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.240560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.740145  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.740223  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:00.240307  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.240806  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:00.240857  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:00.307216  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:00.372358  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:00.376526  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.376564  414292 retry.go:31] will retry after 8.003704403s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.740135  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.740543  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.240281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.240535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.713237  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:01.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.741019  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.741278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.769053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:01.772711  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:01.772742  414292 retry.go:31] will retry after 3.267552643s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:02.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.240302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:02.740266  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:02.740769  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:03.240210  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:03.740228  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.240943  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.740734  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.741190  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:04.741246  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:05.040756  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:05.102503  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:05.102552  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.102572  414292 retry.go:31] will retry after 12.344413157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.240913  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.241244  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:05.740855  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.741188  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.241119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.241411  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.740571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:07.240279  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.240353  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:07.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:07.740195  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.740591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.240525  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.240914  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.381383  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:08.435212  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:08.439369  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.439410  414292 retry.go:31] will retry after 8.892819822s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.740968  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.741390  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.240230  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.240616  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.740331  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.740408  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.740742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:09.740801  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:10.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.240780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:10.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.240651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.740164  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.740235  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.740510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:12.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:12.240683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:12.740202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.240171  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.240526  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:14.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.240715  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.241059  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:14.241125  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:14.741075  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.741149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.741406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.240079  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.240494  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.740230  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.740334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.240695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:16.740834  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:17.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.240434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:17.333063  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:17.388410  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.391967  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.391995  414292 retry.go:31] will retry after 13.113728844s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.447345  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:17.505124  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.505163  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.505182  414292 retry.go:31] will retry after 11.452403849s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.740629  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.740885  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.240553  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.240633  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.740512  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.740589  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.740904  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:18.740955  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:19.240885  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.240962  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.241213  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:19.741015  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.741087  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.240627  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.740183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:21.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.240698  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:21.240754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:21.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.740628  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.240933  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.741110  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.741224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.741585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.240288  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.240369  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.741096  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.741162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.741447  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:23.741492  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:24.240108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.240184  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.240503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:24.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.740312  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.240472  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:26.240350  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:26.240856  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:26.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.240571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.740306  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.740387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.740718  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.240518  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.240860  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:28.240905  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:28.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.741110  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.958534  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:29.018842  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:29.024509  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.024543  414292 retry.go:31] will retry after 28.006345092s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
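
Interleaved with the apply retries, the round_trippers lines show minikube polling GET /api/v1/nodes/functional-682596 every 500ms while it waits for the node's Ready condition; every empty status="" response is the same refused TCP dial surfacing in the node_ready.go warnings. With client-go, that readiness check reduces to fetching the Node and scanning its conditions, roughly as below; this is a sketch assuming a reachable kubeconfig at the path the log uses, not minikube's node_ready.go itself:

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// nodeReady reports whether the named node has condition Ready=True.
func nodeReady(ctx context.Context, cs kubernetes.Interface, name string) (bool, error) {
	node, err := cs.CoreV1().Nodes().Get(ctx, name, metav1.GetOptions{})
	if err != nil {
		return false, err // e.g. "connect: connection refused" while the apiserver is down
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			return c.Status == corev1.ConditionTrue, nil
		}
	}
	return false, nil
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	for {
		ok, err := nodeReady(context.Background(), cs, "functional-682596")
		fmt.Println(ok, err)
		if ok {
			return
		}
		time.Sleep(500 * time.Millisecond) // matches the log's polling cadence
	}
}
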
	I1217 20:26:29.241080  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.241493  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:29.740997  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:30.241045  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.241435  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:30.241493  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:30.505938  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:30.574101  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:30.574147  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.574166  414292 retry.go:31] will retry after 31.982210322s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.740490  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.740579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.740933  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:31.240692  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.240768  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.248432  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=7
	I1217 20:26:31.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.740287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.740084  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.740461  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:32.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:33.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:33.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.740635  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.240634  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.240711  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.241019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.740788  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.741122  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:34.741178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:35.240958  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.241039  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:35.740050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.740407  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.240271  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.740609  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:37.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.240213  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.240481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:37.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:37.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.240448  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.240537  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.240850  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.740388  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.740461  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.740786  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:39.240612  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.240699  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:39.241129  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:39.740905  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.740985  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.741321  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.240050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.240123  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.240466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.240336  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.740423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.740730  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:41.740787  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:42.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:42.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.740284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.740581  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.240458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.740597  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:44.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:44.240672  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:44.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.240370  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.240810  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.740481  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.740887  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:46.240669  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.240750  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:46.241046  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:46.740832  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.740907  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.741230  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.241112  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.241195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.241535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.740564  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.240565  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.240997  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.740893  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:48.741305  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:49.241092  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:49.740094  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.740170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.740483  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.240334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.240696  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.740130  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:51.240180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.240279  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:51.240658  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:51.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.740323  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.740662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.240355  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.240693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.740464  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.740824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:53.240545  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.240622  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:53.241022  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:53.740774  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.740855  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.240965  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.740570  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.240136  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.240208  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.240531  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:55.740689  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:56.240227  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.240326  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:56.740128  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.740207  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.740534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.031083  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:57.091368  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:57.091412  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.091434  414292 retry.go:31] will retry after 46.71155063s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.240719  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.240799  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.241113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.740782  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.740862  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.741143  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:57.741191  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:58.240610  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.240925  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:58.740701  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.740774  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.241090  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.241163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.241466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.740862  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.741174  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:59.741215  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:00.241177  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.241266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.241643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:00.740463  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.740543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.740888  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.240688  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.240764  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.241063  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.740881  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.740989  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.741337  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:01.741388  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:02.240100  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.240176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:02.557038  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:02.616976  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:02.620493  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.620531  414292 retry.go:31] will retry after 42.622456402s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
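
Note that the "--validate=false" hint in the stderr is a red herring here: client-side validation is only the first thing to touch the server (it downloads /openapi/v2), so disabling it would just move the same connection failure to the apply request itself. The real question is whether anything accepts connections on port 8441 at all, which a plain TCP dial answers; a small diagnostic sketch, with the address and port taken from the log:

package main

import (
	"fmt"
	"net"
	"time"
)

// A bare TCP dial separates "apiserver down" from a manifest problem:
// if this fails, kubectl apply can only fail too, with or without
// --validate=false.
func main() {
	conn, err := net.DialTimeout("tcp", "192.168.49.2:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("port 8441 is accepting connections")
}
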
	I1217 20:27:02.740802  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.740875  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.741140  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.240977  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.241074  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.241392  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.740139  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.740586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:04.240156  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.240238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:04.240579  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:04.740272  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.740738  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.240617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... the same GET /api/v1/nodes/functional-682596 poll repeated every ~500ms from 20:27:05 through 20:27:43, every response empty (milliseconds=0); node_ready.go:55 logged the same will-retry warning about every fifth attempt ...]
	W1217 20:27:43.240842  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:43.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
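
The loop above is the core of this failure: minikube's node-readiness wait keeps issuing the same GET against the apiserver every ~500ms and treats "connection refused" as retryable rather than fatal, so the test only fails once the overall wait budget expires. A minimal Go sketch of that pattern follows; it is an illustration, not minikube's actual code (the real wait uses an authenticated client-go client, while this sketch uses a bare HTTP client with TLS verification disabled, and the URL and timings are simply taken from the log).

// readiness_poll_sketch.go — a minimal sketch of the retry pattern in the log
// above, NOT minikube's implementation (which authenticates via client-go).
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	apiURL := "https://192.168.49.2:8441/api/v1/nodes/functional-682596"
	client := &http.Client{
		Timeout: 2 * time.Second,
		Transport: &http.Transport{
			// Sketch only: the real client presents the profile's certs
			// instead of skipping verification.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	deadline := time.Now().Add(6 * time.Minute) // assumed wait budget
	for attempt := 1; time.Now().Before(deadline); attempt++ {
		resp, err := client.Get(apiURL)
		if err == nil {
			status := resp.StatusCode
			resp.Body.Close()
			if status == http.StatusOK {
				fmt.Println("node object reachable; a real client would now decode it and check the Ready condition")
				return
			}
			err = fmt.Errorf("unexpected status %d", status)
		}
		// "dial tcp ... connect: connection refused" lands here and is treated
		// as retryable, matching the node_ready.go will-retry warnings above.
		if attempt%5 == 0 {
			fmt.Printf("attempt %d: %v (will retry)\n", attempt, err)
		}
		time.Sleep(500 * time.Millisecond)
	}
	fmt.Println("timed out waiting for node to become reachable")
}
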
	I1217 20:27:43.803299  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:27:43.859759  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863204  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863297  414292 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:44.241025  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.241121  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... two further identical polls at 20:27:44.740 and 20:27:45.240, the latter answered in milliseconds=1, both still empty ...]
	W1217 20:27:45.242186  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:45.244153  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:45.319007  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319122  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319226  414292 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:45.322350  414292 out.go:179] * Enabled addons: 
	I1217 20:27:45.325857  414292 addons.go:530] duration metric: took 1m57.123269017s for enable addons: enabled=[]
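
Both addon failures above have the same shape: kubectl apply first downloads the apiserver's OpenAPI schema to validate the manifest client-side, so with the apiserver refusing connections the apply dies before anything is submitted, minikube logs "apply failed, will retry", and once retries are exhausted the addon is reported back as enabled=[]. Below is a rough Go sketch of that retry wrapper, under stated assumptions (plain kubectl on PATH instead of the versioned binary plus sudo/KUBECONFIG environment shown in the log, and an invented five-attempt budget); it is not minikube's addons code.

// addon_apply_sketch.go — a sketch of the "apply failed, will retry" pattern
// above, not minikube's addons implementation.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

func applyManifest(path string) error {
	// kubectl validates against the apiserver's OpenAPI schema before applying,
	// which is why a dead apiserver fails with "failed to download openapi".
	cmd := exec.Command("kubectl", "apply", "--force", "-f", path)
	out, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("apply %s: %w\n%s", path, err, out)
	}
	return nil
}

func main() {
	manifest := "/etc/kubernetes/addons/storageclass.yaml" // from the log above
	for attempt := 1; attempt <= 5; attempt++ { // retry budget is an assumption
		if err := applyManifest(manifest); err == nil {
			fmt.Println("applied", manifest)
			return
		} else {
			fmt.Printf("attempt %d failed, will retry: %v\n", attempt, err)
		}
		time.Sleep(2 * time.Second)
	}
	fmt.Println("giving up; addon left disabled (enabled=[])")
}
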
	I1217 20:27:45.740623  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.740707  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.741048  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.240887  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.240956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.741061  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.741135  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.741496  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.745570  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.746779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:47.746914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:48.240563  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.240990  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:48.740796  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.740881  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.741219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.240326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.240395  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.740594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:50.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:50.240662  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:50.741072  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.741398  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.240183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.240123  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.240196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.740295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:52.740683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:53.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.240776  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:53.740345  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.740444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.740732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.240671  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.240743  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.241055  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.740834  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.740911  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.741241  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:54.741301  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:55.241020  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.241088  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.241340  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:55.740115  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.740205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.240618  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.740375  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.740674  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:57.240163  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.240242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:57.240625  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:57.740326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.740758  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.240606  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.240676  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.240930  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.740703  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.741128  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:59.240930  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.241008  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.241330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:59.241390  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:59.741097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.741170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.240277  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.240360  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.240691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.740453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.740814  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.240754  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.740452  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.740539  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.740931  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:01.740984  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:02.240802  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.240878  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.241186  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:02.740889  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.740958  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.741285  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.241130  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.241210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.241568  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.740675  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:04.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.240501  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.240759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:04.240799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:04.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.240365  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.240442  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.240770  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.740435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.740725  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.240330  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.240732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.740442  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.740525  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.740853  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:06.740914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:07.240349  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.240678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:07.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.240587  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.240906  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.740708  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.741159  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:08.741223  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:09.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.241169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.241495  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:09.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.740766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.240453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.240750  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:11.240243  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.240712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:11.240767  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.740446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.740716  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.240420  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.240502  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.240833  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.740634  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.740952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:13.240720  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.240805  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:13.241122  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:13.740900  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.740983  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.240146  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.740140  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.740209  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.240203  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.240633  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:15.740840  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:16.240359  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.240436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.240823  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:16.740505  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.740583  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.740911  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.240693  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.240772  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.241095  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.740839  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.740925  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:17.741241  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:18.241106  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.241186  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.241520  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:18.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.740344  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.240841  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:20.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.240438  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.240765  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:20.240823  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:20.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.740426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.740691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.240640  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.740465  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.740819  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.240416  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.240484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.240741  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.740262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.740592  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:22.740654  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:23.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.740759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.240649  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.240730  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.740869  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.741289  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:24.741351  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:25.241053  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:25.740052  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.740478  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.240294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.740507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:27.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:27.240679  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:27.740368  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.240689  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.240769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.740814  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.740891  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.741191  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:29.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.241401  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:29.241453  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:29.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.740474  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.240176  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.740407  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.740484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.740834  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.240680  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.740183  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:31.740674  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:32.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:32.740110  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.740177  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.740450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.240131  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.240205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.240522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.740169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:34.240083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.240169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.240471  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:34.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:34.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.740604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.740083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.740163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.740501  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:36.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:36.240620  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:36.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.240126  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.240510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.740120  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.740203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.740533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:38.240393  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.240473  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.240804  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:38.240859  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:38.740418  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.740493  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.740792  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.240679  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.240778  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.241109  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.740934  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.741010  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.741373  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.240097  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.240172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.240452  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:40.740634  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:41.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.240273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:41.740062  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.740137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.740430  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.240290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:42.740831  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:43.240367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.240476  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:43.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.740560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.240459  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.240534  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.240870  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.740695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:45.240439  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.240568  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.241041  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:45.241102  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:45.740871  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.740956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.741304  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.241069  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.241138  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.740186  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.740282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:47.740657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:48.240600  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.241014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:48.740811  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.740888  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.741185  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.241327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.741121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.741215  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.741596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:49.741649  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:50.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:50.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.740508  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.240167  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.240239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.740310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:52.240373  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.240709  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:52.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:52.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.740584  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.740309  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.740380  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:54.240655  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:54.241123  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:54.740804  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.741266  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.241066  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.241142  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.240178  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:56.740664  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:57.240172  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:57.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.740644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.240496  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.240566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.240825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.740558  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.740638  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:58.741026  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:59.240760  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.240835  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:59.740992  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.741067  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.241179  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.241274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.241594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.740545  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.740619  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.740922  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:01.240654  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.241023  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:01.241072  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:01.740787  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.740865  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.741183  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.240971  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.241393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.740980  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.741391  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.240071  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.240147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.240491  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.740172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:03.740593  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:04.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.240574  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:04.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.740712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.240264  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.740697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:05.740738  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:06.240370  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.240788  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:06.741116  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.741191  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.741539  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.240080  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.240463  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.740303  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:08.240575  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.241002  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:08.740757  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.740826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.741089  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.241059  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.241165  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.241510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.240444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.240756  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.740293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.740629  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:10.740685  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:11.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.240454  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:11.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.740475  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.740809  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.240548  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.240621  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.240958  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.740831  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.741131  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:12.741180  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:13.240743  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.240816  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.241082  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:13.740877  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.740957  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.741251  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.241067  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.241141  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.241456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.740157  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.740261  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.740657  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:15.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:15.240849  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:15.740520  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.740600  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.740926  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.240670  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.240741  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:17.241030  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.241485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:17.241544  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:17.741007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.741075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.741330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.240431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.740233  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.240118  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.240527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.740241  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:19.740652  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:20.240357  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:20.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.740702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.240449  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.240532  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.240864  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.741119  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:21.741177  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:22.240878  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.240947  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.241200  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:22.740985  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.741058  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.741409  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.240195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.240546  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.740154  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:24.240534  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.240612  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.240947  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:24.241000  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:24.740777  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.740857  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.741204  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.240998  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.241333  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.741123  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.741202  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.741530  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.240642  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.740432  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.740744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:26.740792  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.240585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:27.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.740392  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.740767  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.240517  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.240857  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.740679  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.741120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:28.741173  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:29.240914  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.240995  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.241335  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:29.740969  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.741045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.741327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.240069  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.240150  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.240512  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:31.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.240521  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:31.240571  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:31.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.740238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.740576  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.240165  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.240262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.740217  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.740555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:33.240266  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.240665  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:33.240725  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:33.740182  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.740619  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.240178  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.240456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.740169  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.740676  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.240238  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.240333  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:35.740724  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:36.240205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.240297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.240641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:36.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.740448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.240379  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.740614  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:38.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.240670  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.241007  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:38.241051  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:38.740806  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.740880  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.741145  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.240077  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.240158  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.240533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.740575  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.240271  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.240352  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.740751  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:40.740807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:41.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.240603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:41.740073  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.740152  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.740438  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.240671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:42.740846  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:43.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.240416  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:43.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.740242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.240555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.241028  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.740766  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.740836  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.741107  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:44.741148  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:45.241078  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.241164  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.241604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:45.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.240352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:47.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.240462  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.240824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:47.240891  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:47.740486  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.740555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.740822  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.240800  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.240885  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.241255  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.741124  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.741203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.741664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.240445  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.240522  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.240794  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.743007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.743084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.743421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:49.743497  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... the identical GET https://192.168.49.2:8441/api/v1/nodes/functional-682596 poll repeated every ~500ms from 20:29:50 through 20:30:50, each attempt returning an empty response; the node_ready.go:55 "connection refused" warning (will retry) recurred at 20:29:52, 20:29:54, 20:29:57, 20:29:59, 20:30:01, 20:30:03, 20:30:06, 20:30:08, 20:30:10, 20:30:12, 20:30:14, 20:30:17, 20:30:19, 20:30:21, 20:30:23, 20:30:26, 20:30:28, 20:30:30, 20:30:32, 20:30:35, 20:30:37, 20:30:40, 20:30:42, 20:30:44, 20:30:46, and 20:30:49 ...]
	I1217 20:30:51.240208  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.240317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:51.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.740425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:51.740821  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:52.240214  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.240300  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:52.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.240242  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.240341  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.240677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.740762  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:54.240735  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.240807  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.241141  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:54.241194  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:54.740836  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.741298  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.241043  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.241118  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.740103  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.740188  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.740556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.240272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:56.740726  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:57.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.240613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:57.740358  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.740443  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.240510  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.240843  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:59.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:59.240707  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:59.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.740428  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.740694  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.240295  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.240376  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.240702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.740561  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.740646  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:01.240812  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.240892  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:01.241213  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:01.740960  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.741043  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.741371  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.240103  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.240187  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.740227  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.740315  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.240590  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.740279  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.740349  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:03.740743  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:04.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.240829  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:04.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.740119  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.740198  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.740527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:06.240220  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.240652  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:06.240704  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:06.740395  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.740474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.740826  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.240699  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.740382  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:08.240694  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.241125  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:08.241178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:08.740933  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.741009  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.741272  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.240107  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.240509  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.740653  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.240221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.240567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.740283  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.740720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:10.740781  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:11.240308  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.240387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.240742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.240582  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.740305  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:13.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.240753  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:13.240804  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:13.740494  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.740566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.740865  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.240695  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.241120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.740883  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.741207  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:15.241001  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.241086  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.241424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:15.241480  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:15.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.240120  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.240190  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.240504  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.740176  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.740245  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.740595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:17.740646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:18.240578  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.240656  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.241010  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:18.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.240589  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.240665  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.240973  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.740527  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.740602  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.740938  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:19.741004  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:20.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.240826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:20.740847  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.741205  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.240999  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.241433  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.741076  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.741157  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.741476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:21.741535  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:22.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.240225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:22.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.740288  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.740671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.240361  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.240439  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.740727  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:24.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.240834  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.241219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:24.241275  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:24.740840  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.741224  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.240994  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.241325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.741146  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.741238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.741600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.240173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.740222  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.740302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:26.740604  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:27.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.740267  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.240847  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.740580  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.740654  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:28.741030  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:29.240927  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.241003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.241345  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:29.740932  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.741003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.741297  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.240066  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.240144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.240477  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:31.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.240227  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:31.240572  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:31.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.240364  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.240793  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.740739  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:33.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:33.240635  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:33.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.740654  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.240597  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.240677  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.240945  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.740794  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.741113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:35.240931  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:35.241431  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:35.740086  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.740458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.740185  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.240233  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.740273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.740567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:37.740616  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:38.240625  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.240697  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:38.740867  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.741194  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.240204  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.740275  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.740669  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:39.740728  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:40.240374  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.240701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:40.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.740679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.240409  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.240499  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.240858  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:41.740768  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:42.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.240703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:42.740571  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.740645  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.740967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.240727  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.240796  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.241050  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.740827  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.740901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.741236  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:43.741293  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:44.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.241176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.241525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:44.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.745967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1217 20:31:45.240798  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.240901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.241310  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:45.741143  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.741226  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.741583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:45.741646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:46.241073  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.241146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.241399  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:46.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.240190  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.740649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:48.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.240554  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.241013  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:48.241064  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:48.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.740603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:49.240608  414292 type.go:168] "Request Body" body=""
	I1217 20:31:49.240675  414292 node_ready.go:38] duration metric: took 6m0.000721639s for node "functional-682596" to be "Ready" ...
	I1217 20:31:49.243794  414292 out.go:203] 
	W1217 20:31:49.246551  414292 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1217 20:31:49.246575  414292 out.go:285] * 
	W1217 20:31:49.249079  414292 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:31:49.251429  414292 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.001848112Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.001930616Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002035519Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002109095Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002171430Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002240592Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002308310Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002378620Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002450473Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.002551603Z" level=info msg="Connect containerd service"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.003013015Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.003712558Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.020658049Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.020740938Z" level=info msg="Start recovering state"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.023140602Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.023368658Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066191638Z" level=info msg="Start event monitor"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066246202Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066258756Z" level=info msg="Start streaming server"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066268225Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066275971Z" level=info msg="runtime interface starting up..."
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066282896Z" level=info msg="starting plugins..."
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.066296787Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:25:47 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 17 20:25:47 functional-682596 containerd[5330]: time="2025-12-17T20:25:47.067121641Z" level=info msg="containerd successfully booted in 0.091187s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:31:53.561516    8647 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:53.562830    8647 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:53.563309    8647 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:53.565137    8647 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:31:53.565497    8647 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:31:53 up  3:14,  0 user,  load average: 0.24, 0.31, 0.77
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 17 20:31:50 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:50 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:50 functional-682596 kubelet[8434]: E1217 20:31:50.809013    8434 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:50 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:51 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Dec 17 20:31:51 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:51 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:51 functional-682596 kubelet[8524]: E1217 20:31:51.554233    8524 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:51 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:51 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:52 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 813.
	Dec 17 20:31:52 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:52 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:52 functional-682596 kubelet[8544]: E1217 20:31:52.300880    8544 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:52 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:52 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:31:52 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 814.
	Dec 17 20:31:52 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:52 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:31:53 functional-682596 kubelet[8565]: E1217 20:31:53.056272    8565 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:31:53 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:31:53 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (377.806875ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.33s)
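The kubelet section of the logs above points at the likely root cause: every systemd restart (counters 811 through 814) fails validation because this kubelet build refuses to run on a cgroup v1 host. As a hedged follow-up (hypothetical, not part of the recorded run, and assuming shell access to the Jenkins host), the node's cgroup mode can be confirmed by checking the filesystem type of the cgroup mount:

	# Hypothetical diagnostic; "cgroup2fs" means cgroup v2, while "tmpfs" means a
	# cgroup v1 hierarchy, which the v1.35.0-rc.1 kubelet rejects at startup.
	docker exec functional-682596 stat -fc %T /sys/fs/cgroup/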

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.34s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 kubectl -- --context functional-682596 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 kubectl -- --context functional-682596 get pods: exit status 1 (107.227768ms)

** stderr ** 
	E1217 20:32:01.695833  419447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:01.696162  419447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:01.697667  419447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:01.697995  419447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:01.699439  419447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
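These connection-refused errors are consistent with the empty "container status" table earlier in the report: with the kubelet crash-looping, no static control-plane pods are ever created, so nothing listens on 8441. One way to double-check from inside the node (a hypothetical follow-up, assuming crictl in the kicbase image as used elsewhere in this run) is:

	# Hypothetical check, not part of the recorded run; an empty listing
	# matches the "container status" section and explains the refused port.
	docker exec functional-682596 sudo crictl ps -a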
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-682596 kubectl -- --context functional-682596 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
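The inspect output above shows 8441/tcp (the apiserver port) published to 127.0.0.1:33166 on the host. A minimal host-side probe of that mapping, assuming curl is available (again hypothetical, not part of the recorded run), would be:

	# With the apiserver down this fails with "connection refused" (curl exit
	# code 7), matching the kubectl errors above.
	curl -sk https://127.0.0.1:33166/healthz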
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (354.320741ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-032730 image ls --format yaml --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format short --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format json --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format table --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh     │ functional-032730 ssh pgrep buildkitd                                                                                                                 │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image   │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete  │ -p functional-032730                                                                                                                                  │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start   │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start   │ -p functional-682596 --alsologtostderr -v=8                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:latest                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add minikube-local-cache-test:functional-682596                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache delete minikube-local-cache-test:functional-682596                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl images                                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │                     │
	│ cache   │ functional-682596 cache reload                                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ kubectl │ functional-682596 kubectl -- --context functional-682596 get pods                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:25:44
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:25:44.045489  414292 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:25:44.045686  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.045714  414292 out.go:374] Setting ErrFile to fd 2...
	I1217 20:25:44.045733  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.046029  414292 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:25:44.046470  414292 out.go:368] Setting JSON to false
	I1217 20:25:44.047409  414292 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11289,"bootTime":1765991855,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:25:44.047515  414292 start.go:143] virtualization:  
	I1217 20:25:44.053027  414292 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:25:44.056011  414292 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:25:44.056093  414292 notify.go:221] Checking for updates...
	I1217 20:25:44.061883  414292 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:25:44.064833  414292 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:44.067589  414292 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:25:44.070446  414292 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:25:44.073380  414292 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:25:44.076968  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:44.077128  414292 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:25:44.112208  414292 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:25:44.112455  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.167112  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.158029599 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.167209  414292 docker.go:319] overlay module found
	I1217 20:25:44.170171  414292 out.go:179] * Using the docker driver based on existing profile
	I1217 20:25:44.173086  414292 start.go:309] selected driver: docker
	I1217 20:25:44.173109  414292 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.173214  414292 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:25:44.173330  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.234258  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.225129855 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.234785  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:44.234848  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:44.234909  414292 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.238034  414292 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:25:44.240853  414292 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:25:44.243760  414292 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:25:44.246713  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:44.246768  414292 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:25:44.246782  414292 cache.go:65] Caching tarball of preloaded images
	I1217 20:25:44.246797  414292 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:25:44.246869  414292 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:25:44.246880  414292 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:25:44.246994  414292 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:25:44.265764  414292 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:25:44.265789  414292 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:25:44.265812  414292 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:25:44.265841  414292 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:25:44.265903  414292 start.go:364] duration metric: took 36.013µs to acquireMachinesLock for "functional-682596"
	I1217 20:25:44.265927  414292 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:25:44.265936  414292 fix.go:54] fixHost starting: 
	I1217 20:25:44.266187  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:44.282574  414292 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:25:44.282603  414292 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:25:44.285918  414292 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:25:44.285950  414292 machine.go:94] provisionDockerMachine start ...
	I1217 20:25:44.286031  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.302759  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.303096  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.303111  414292 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:25:44.431913  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.431939  414292 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:25:44.432002  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.450770  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.451117  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.451136  414292 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:25:44.601580  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.601732  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.619103  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.619412  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.619435  414292 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:25:44.748545  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:25:44.748571  414292 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:25:44.748593  414292 ubuntu.go:190] setting up certificates
	I1217 20:25:44.748603  414292 provision.go:84] configureAuth start
	I1217 20:25:44.748675  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:44.766057  414292 provision.go:143] copyHostCerts
	I1217 20:25:44.766100  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766141  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:25:44.766152  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766226  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:25:44.766327  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766347  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:25:44.766357  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766385  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:25:44.766441  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766461  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:25:44.766471  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766501  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:25:44.766561  414292 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
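	The SAN list above (127.0.0.1, 192.168.49.2, the node hostname, localhost, minikube) can be reproduced with plain openssl against the same CA material; a minimal sketch under those assumptions, not minikube's actual code path:

	    # Hypothetical re-creation of the server cert: key + CSR, then a CA-signed cert with matching SANs.
	    openssl req -newkey rsa:2048 -nodes -keyout server-key.pem \
	      -subj "/O=jenkins.functional-682596" -out server.csr
	    openssl x509 -req -in server.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial -days 365 \
	      -extfile <(printf 'subjectAltName=IP:127.0.0.1,IP:192.168.49.2,DNS:functional-682596,DNS:localhost,DNS:minikube') \
	      -out server.pem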
	I1217 20:25:45.107844  414292 provision.go:177] copyRemoteCerts
	I1217 20:25:45.108657  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:25:45.108873  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.149674  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.277212  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1217 20:25:45.277284  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:25:45.298737  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1217 20:25:45.298796  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:25:45.320659  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1217 20:25:45.320720  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:25:45.338755  414292 provision.go:87] duration metric: took 590.128101ms to configureAuth
	I1217 20:25:45.338800  414292 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:25:45.338978  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:45.339040  414292 machine.go:97] duration metric: took 1.053082119s to provisionDockerMachine
	I1217 20:25:45.339048  414292 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:25:45.339059  414292 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:25:45.339122  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:25:45.339165  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.356059  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.452345  414292 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:25:45.455946  414292 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1217 20:25:45.455965  414292 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1217 20:25:45.455970  414292 command_runner.go:130] > VERSION_ID="12"
	I1217 20:25:45.455975  414292 command_runner.go:130] > VERSION="12 (bookworm)"
	I1217 20:25:45.455980  414292 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1217 20:25:45.455983  414292 command_runner.go:130] > ID=debian
	I1217 20:25:45.455989  414292 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1217 20:25:45.455994  414292 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1217 20:25:45.456008  414292 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1217 20:25:45.456046  414292 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:25:45.456062  414292 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:25:45.456073  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:25:45.456130  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:25:45.456208  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:25:45.456215  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /etc/ssl/certs/3694612.pem
	I1217 20:25:45.456308  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:25:45.456313  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> /etc/test/nested/copy/369461/hosts
	I1217 20:25:45.456356  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:25:45.464083  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.481460  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:25:45.500420  414292 start.go:296] duration metric: took 161.357637ms for postStartSetup
	I1217 20:25:45.500542  414292 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:25:45.500615  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.517677  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.609195  414292 command_runner.go:130] > 18%
	I1217 20:25:45.609800  414292 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:25:45.614741  414292 command_runner.go:130] > 159G
	I1217 20:25:45.614774  414292 fix.go:56] duration metric: took 1.348835133s for fixHost
	I1217 20:25:45.614785  414292 start.go:83] releasing machines lock for "functional-682596", held for 1.348870218s
	I1217 20:25:45.614866  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:45.631621  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:45.631685  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:45.631702  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:45.631735  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:45.631767  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:45.631798  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:45.631848  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.631888  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.631907  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.631926  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.631943  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:45.631995  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.649517  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.754346  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:45.772163  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:45.789636  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:45.795706  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:45.796203  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.803937  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:45.811516  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815311  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815389  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815474  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.856132  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:45.856705  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:45.864064  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.871519  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:45.879293  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883196  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883238  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883306  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.924322  414292 command_runner.go:130] > b5213941
	I1217 20:25:45.924802  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:45.932259  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.939603  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:45.947311  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.950955  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951320  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951411  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.993968  414292 command_runner.go:130] > 51391683
	I1217 20:25:45.994167  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:25:46.002855  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:25:46.007551  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
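	The hash-then-symlink sequence above is the standard OpenSSL trust-store layout: each CA under /etc/ssl/certs is located through a <subject-hash>.0 symlink. A sketch generalizing what the log does per certificate:

	    # For every CA the log copied into /usr/share/ca-certificates, create the hash symlink
	    # OpenSSL expects (e.g. b5213941.0 for minikubeCA.pem, as seen above).
	    for pem in /usr/share/ca-certificates/*.pem; do
	      hash=$(openssl x509 -hash -noout -in "$pem")
	      sudo ln -fs "$pem" "/etc/ssl/certs/${hash}.0"
	    done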
	I1217 20:25:46.011748  414292 ssh_runner.go:195] Run: cat /version.json
	I1217 20:25:46.011837  414292 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:25:46.016112  414292 command_runner.go:130] > {"iso_version": "v1.37.0-1765579389-22117", "kicbase_version": "v0.0.48-1765661130-22141", "minikube_version": "v1.37.0", "commit": "cbb33128a244032d08f8fc6e6c9f03b30f0da3e4"}
	I1217 20:25:46.018576  414292 ssh_runner.go:195] Run: systemctl --version
	I1217 20:25:46.126907  414292 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1217 20:25:46.127016  414292 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1217 20:25:46.127060  414292 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1217 20:25:46.127172  414292 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1217 20:25:46.131726  414292 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1217 20:25:46.131887  414292 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:25:46.131965  414292 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:25:46.140024  414292 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:25:46.140047  414292 start.go:496] detecting cgroup driver to use...
	I1217 20:25:46.140078  414292 detect.go:187] detected "cgroupfs" cgroup driver on host os
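	The "cgroupfs" detection above can be double-checked from the host side; two quick probes, offered as a sketch (the second assumes the Docker CLI is present):

	    stat -fc %T /sys/fs/cgroup/              # "cgroup2fs" => unified hierarchy, "tmpfs" => cgroup v1
	    docker info --format '{{.CgroupDriver}}' # "cgroupfs" or "systemd"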
	I1217 20:25:46.140156  414292 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:25:46.155753  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:25:46.168916  414292 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:25:46.169009  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:25:46.184457  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:25:46.197441  414292 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:25:46.302684  414292 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:25:46.421553  414292 docker.go:234] disabling docker service ...
	I1217 20:25:46.421621  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:25:46.436823  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:25:46.449890  414292 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:25:46.565021  414292 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:25:46.678341  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:25:46.693104  414292 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:25:46.705993  414292 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1217 20:25:46.707385  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:25:46.716410  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:25:46.724756  414292 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:25:46.724876  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:25:46.733647  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.742030  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:25:46.750673  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.759312  414292 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:25:46.768595  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:25:46.777345  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:25:46.786196  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
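	Taken together, the sed substitutions above target a handful of settings in /etc/containerd/config.toml; a quick way to verify the intended end state in place:

	    # Expected values follow directly from the substitutions in the log above.
	    grep -nE 'sandbox_image|SystemdCgroup|conf_dir|enable_unprivileged_ports' /etc/containerd/config.toml
	    #   sandbox_image = "registry.k8s.io/pause:3.10.1"
	    #   SystemdCgroup = false
	    #   conf_dir = "/etc/cni/net.d"
	    #   enable_unprivileged_ports = true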
	I1217 20:25:46.795479  414292 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:25:46.802392  414292 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1217 20:25:46.803423  414292 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:25:46.811004  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:46.926090  414292 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 20:25:47.068989  414292 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:25:47.069169  414292 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:25:47.073250  414292 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1217 20:25:47.073355  414292 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1217 20:25:47.073385  414292 command_runner.go:130] > Device: 0,72	Inode: 1618        Links: 1
	I1217 20:25:47.073441  414292 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.073470  414292 command_runner.go:130] > Access: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073512  414292 command_runner.go:130] > Modify: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073542  414292 command_runner.go:130] > Change: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073561  414292 command_runner.go:130] >  Birth: -
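	The 60-second socket wait (start.go:543) reduces to polling for the unix socket and then probing the runtime; a minimal shell stand-in, assuming the same socket path and crictl location shown in this log:

	    for i in $(seq 1 60); do
	      test -S /run/containerd/containerd.sock && break   # -S: path exists and is a socket
	      sleep 1
	    done
	    sudo /usr/local/bin/crictl version   # the same readiness probe the log runs next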
	I1217 20:25:47.073923  414292 start.go:564] Will wait 60s for crictl version
	I1217 20:25:47.074046  414292 ssh_runner.go:195] Run: which crictl
	I1217 20:25:47.077775  414292 command_runner.go:130] > /usr/local/bin/crictl
	I1217 20:25:47.078218  414292 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:25:47.104139  414292 command_runner.go:130] > Version:  0.1.0
	I1217 20:25:47.104225  414292 command_runner.go:130] > RuntimeName:  containerd
	I1217 20:25:47.104269  414292 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1217 20:25:47.104295  414292 command_runner.go:130] > RuntimeApiVersion:  v1
	I1217 20:25:47.106475  414292 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:25:47.106628  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.130403  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.132698  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.152199  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.159813  414292 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:25:47.162759  414292 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:25:47.179237  414292 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:25:47.183476  414292 command_runner.go:130] > 192.168.49.1	host.minikube.internal
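	The long --format template in the network inspect above can be exercised piecewise when debugging; two smaller, equivalent probes (hypothetical, assuming the functional-682596 network exists):

	    docker network inspect functional-682596 --format '{{.Name}} {{.Driver}}'
	    docker network inspect functional-682596 --format '{{range .IPAM.Config}}{{.Subnet}} {{.Gateway}}{{end}}'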
	I1217 20:25:47.183701  414292 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:25:47.183825  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:47.183890  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.207538  414292 command_runner.go:130] > {
	I1217 20:25:47.207560  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.207564  414292 command_runner.go:130] >     {
	I1217 20:25:47.207574  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.207582  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207588  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.207591  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207595  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207607  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.207614  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207618  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.207625  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207630  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207636  414292 command_runner.go:130] >     },
	I1217 20:25:47.207639  414292 command_runner.go:130] >     {
	I1217 20:25:47.207647  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.207655  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207660  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.207664  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207668  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207678  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.207684  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207688  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.207692  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207696  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207698  414292 command_runner.go:130] >     },
	I1217 20:25:47.207702  414292 command_runner.go:130] >     {
	I1217 20:25:47.207709  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.207715  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207720  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.207735  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207747  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207756  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.207759  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207763  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.207766  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.207770  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207773  414292 command_runner.go:130] >     },
	I1217 20:25:47.207776  414292 command_runner.go:130] >     {
	I1217 20:25:47.207783  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.207787  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207791  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.207795  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207798  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207806  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.207809  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207813  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.207817  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207822  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207826  414292 command_runner.go:130] >       },
	I1217 20:25:47.207833  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207837  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207842  414292 command_runner.go:130] >     },
	I1217 20:25:47.207846  414292 command_runner.go:130] >     {
	I1217 20:25:47.207853  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.207859  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207865  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.207867  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207872  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207886  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.207890  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207894  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.207897  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207906  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207915  414292 command_runner.go:130] >       },
	I1217 20:25:47.207928  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207932  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207934  414292 command_runner.go:130] >     },
	I1217 20:25:47.207938  414292 command_runner.go:130] >     {
	I1217 20:25:47.207947  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.207955  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207961  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.207964  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207968  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207976  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.207982  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207986  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.207990  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207997  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208001  414292 command_runner.go:130] >       },
	I1217 20:25:47.208020  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208028  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208032  414292 command_runner.go:130] >     },
	I1217 20:25:47.208035  414292 command_runner.go:130] >     {
	I1217 20:25:47.208042  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.208049  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208054  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.208058  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208062  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208069  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.208074  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208079  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.208082  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208088  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208091  414292 command_runner.go:130] >     },
	I1217 20:25:47.208097  414292 command_runner.go:130] >     {
	I1217 20:25:47.208104  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.208114  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208120  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.208123  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208128  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208142  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.208146  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208149  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.208153  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208157  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208163  414292 command_runner.go:130] >       },
	I1217 20:25:47.208168  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208173  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208177  414292 command_runner.go:130] >     },
	I1217 20:25:47.208183  414292 command_runner.go:130] >     {
	I1217 20:25:47.208189  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.208195  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208200  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.208203  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208207  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208215  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.208221  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208225  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.208229  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208233  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.208237  414292 command_runner.go:130] >       },
	I1217 20:25:47.208240  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208245  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.208339  414292 command_runner.go:130] >     }
	I1217 20:25:47.208342  414292 command_runner.go:130] >   ]
	I1217 20:25:47.208344  414292 command_runner.go:130] > }
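	The JSON above is what the preload check compares against the expected image set; a one-liner to summarize it, assuming jq is available on the node:

	    # Hypothetical summary: one "tag<TAB>size" line per image from the listing above.
	    sudo crictl images --output json | jq -r '.images[] | "\(.repoTags[0])\t\(.size)"'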
	I1217 20:25:47.208525  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.208539  414292 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:25:47.208601  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.230634  414292 command_runner.go:130] > {
	I1217 20:25:47.230653  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.230659  414292 command_runner.go:130] >     {
	I1217 20:25:47.230668  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.230673  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230679  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.230683  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230687  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230696  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.230703  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230721  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.230725  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230729  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230735  414292 command_runner.go:130] >     },
	I1217 20:25:47.230741  414292 command_runner.go:130] >     {
	I1217 20:25:47.230756  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.230764  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230769  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.230773  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230786  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230798  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.230801  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230812  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.230816  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230819  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230823  414292 command_runner.go:130] >     },
	I1217 20:25:47.230826  414292 command_runner.go:130] >     {
	I1217 20:25:47.230833  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.230839  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230844  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.230857  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230888  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230900  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.230911  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230916  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.230923  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.230927  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230936  414292 command_runner.go:130] >     },
	I1217 20:25:47.230939  414292 command_runner.go:130] >     {
	I1217 20:25:47.230946  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.230950  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230954  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.230960  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230964  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230972  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.230984  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230988  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.230991  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.230995  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.230998  414292 command_runner.go:130] >       },
	I1217 20:25:47.231003  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231009  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231012  414292 command_runner.go:130] >     },
	I1217 20:25:47.231018  414292 command_runner.go:130] >     {
	I1217 20:25:47.231024  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.231037  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231042  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.231045  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231050  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231063  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.231067  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231071  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.231074  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231087  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231093  414292 command_runner.go:130] >       },
	I1217 20:25:47.231097  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231111  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231117  414292 command_runner.go:130] >     },
	I1217 20:25:47.231125  414292 command_runner.go:130] >     {
	I1217 20:25:47.231132  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.231138  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231144  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.231151  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231155  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231164  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.231168  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231172  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.231178  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231194  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231200  414292 command_runner.go:130] >       },
	I1217 20:25:47.231204  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231208  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231211  414292 command_runner.go:130] >     },
	I1217 20:25:47.231214  414292 command_runner.go:130] >     {
	I1217 20:25:47.231223  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.231238  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231246  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.231250  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231254  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231264  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.231276  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231280  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.231284  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231288  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231291  414292 command_runner.go:130] >     },
	I1217 20:25:47.231294  414292 command_runner.go:130] >     {
	I1217 20:25:47.231309  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.231317  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231323  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.231333  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231337  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231347  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.231359  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231363  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.231366  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231370  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231373  414292 command_runner.go:130] >       },
	I1217 20:25:47.231379  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231392  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231395  414292 command_runner.go:130] >     },
	I1217 20:25:47.231405  414292 command_runner.go:130] >     {
	I1217 20:25:47.231412  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.231418  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231423  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.231428  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231437  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231445  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.231448  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231452  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.231455  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231459  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.231462  414292 command_runner.go:130] >       },
	I1217 20:25:47.231466  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231469  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.231473  414292 command_runner.go:130] >     }
	I1217 20:25:47.231479  414292 command_runner.go:130] >   ]
	I1217 20:25:47.231482  414292 command_runner.go:130] > }
	I1217 20:25:47.233897  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.233919  414292 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:25:47.233928  414292 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:25:47.234041  414292 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
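	The [Unit]/[Service] fragment above is what minikube installs as a systemd drop-in for kubelet. A sketch of doing the same by hand; the drop-in path is an assumption, while the flags are copied verbatim from the log:

	    sudo mkdir -p /etc/systemd/system/kubelet.service.d
	    sudo tee /etc/systemd/system/kubelet.service.d/10-kubeadm.conf >/dev/null <<'EOF'
	    [Unit]
	    Wants=containerd.service

	    [Service]
	    ExecStart=
	    ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	    EOF
	    sudo systemctl daemon-reload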
	I1217 20:25:47.234107  414292 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:25:47.256786  414292 command_runner.go:130] > {
	I1217 20:25:47.256808  414292 command_runner.go:130] >   "cniconfig": {
	I1217 20:25:47.256814  414292 command_runner.go:130] >     "Networks": [
	I1217 20:25:47.256818  414292 command_runner.go:130] >       {
	I1217 20:25:47.256823  414292 command_runner.go:130] >         "Config": {
	I1217 20:25:47.256827  414292 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1217 20:25:47.256833  414292 command_runner.go:130] >           "Name": "cni-loopback",
	I1217 20:25:47.256837  414292 command_runner.go:130] >           "Plugins": [
	I1217 20:25:47.256840  414292 command_runner.go:130] >             {
	I1217 20:25:47.256846  414292 command_runner.go:130] >               "Network": {
	I1217 20:25:47.256851  414292 command_runner.go:130] >                 "ipam": {},
	I1217 20:25:47.256863  414292 command_runner.go:130] >                 "type": "loopback"
	I1217 20:25:47.256875  414292 command_runner.go:130] >               },
	I1217 20:25:47.256880  414292 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1217 20:25:47.256883  414292 command_runner.go:130] >             }
	I1217 20:25:47.256887  414292 command_runner.go:130] >           ],
	I1217 20:25:47.256896  414292 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1217 20:25:47.256900  414292 command_runner.go:130] >         },
	I1217 20:25:47.256911  414292 command_runner.go:130] >         "IFName": "lo"
	I1217 20:25:47.256917  414292 command_runner.go:130] >       }
	I1217 20:25:47.256920  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256924  414292 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1217 20:25:47.256927  414292 command_runner.go:130] >     "PluginDirs": [
	I1217 20:25:47.256932  414292 command_runner.go:130] >       "/opt/cni/bin"
	I1217 20:25:47.256941  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256945  414292 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1217 20:25:47.256949  414292 command_runner.go:130] >     "Prefix": "eth"
	I1217 20:25:47.256952  414292 command_runner.go:130] >   },
	I1217 20:25:47.256957  414292 command_runner.go:130] >   "config": {
	I1217 20:25:47.256962  414292 command_runner.go:130] >     "cdiSpecDirs": [
	I1217 20:25:47.256965  414292 command_runner.go:130] >       "/etc/cdi",
	I1217 20:25:47.256969  414292 command_runner.go:130] >       "/var/run/cdi"
	I1217 20:25:47.256977  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256985  414292 command_runner.go:130] >     "cni": {
	I1217 20:25:47.256991  414292 command_runner.go:130] >       "binDir": "",
	I1217 20:25:47.256995  414292 command_runner.go:130] >       "binDirs": [
	I1217 20:25:47.256999  414292 command_runner.go:130] >         "/opt/cni/bin"
	I1217 20:25:47.257003  414292 command_runner.go:130] >       ],
	I1217 20:25:47.257008  414292 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1217 20:25:47.257025  414292 command_runner.go:130] >       "confTemplate": "",
	I1217 20:25:47.257029  414292 command_runner.go:130] >       "ipPref": "",
	I1217 20:25:47.257033  414292 command_runner.go:130] >       "maxConfNum": 1,
	I1217 20:25:47.257040  414292 command_runner.go:130] >       "setupSerially": false,
	I1217 20:25:47.257044  414292 command_runner.go:130] >       "useInternalLoopback": false
	I1217 20:25:47.257049  414292 command_runner.go:130] >     },
	I1217 20:25:47.257057  414292 command_runner.go:130] >     "containerd": {
	I1217 20:25:47.257061  414292 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1217 20:25:47.257069  414292 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1217 20:25:47.257076  414292 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1217 20:25:47.257080  414292 command_runner.go:130] >       "runtimes": {
	I1217 20:25:47.257084  414292 command_runner.go:130] >         "runc": {
	I1217 20:25:47.257097  414292 command_runner.go:130] >           "ContainerAnnotations": null,
	I1217 20:25:47.257102  414292 command_runner.go:130] >           "PodAnnotations": null,
	I1217 20:25:47.257106  414292 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1217 20:25:47.257111  414292 command_runner.go:130] >           "cgroupWritable": false,
	I1217 20:25:47.257119  414292 command_runner.go:130] >           "cniConfDir": "",
	I1217 20:25:47.257123  414292 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1217 20:25:47.257127  414292 command_runner.go:130] >           "io_type": "",
	I1217 20:25:47.257133  414292 command_runner.go:130] >           "options": {
	I1217 20:25:47.257139  414292 command_runner.go:130] >             "BinaryName": "",
	I1217 20:25:47.257143  414292 command_runner.go:130] >             "CriuImagePath": "",
	I1217 20:25:47.257148  414292 command_runner.go:130] >             "CriuWorkPath": "",
	I1217 20:25:47.257154  414292 command_runner.go:130] >             "IoGid": 0,
	I1217 20:25:47.257158  414292 command_runner.go:130] >             "IoUid": 0,
	I1217 20:25:47.257162  414292 command_runner.go:130] >             "NoNewKeyring": false,
	I1217 20:25:47.257174  414292 command_runner.go:130] >             "Root": "",
	I1217 20:25:47.257186  414292 command_runner.go:130] >             "ShimCgroup": "",
	I1217 20:25:47.257193  414292 command_runner.go:130] >             "SystemdCgroup": false
	I1217 20:25:47.257196  414292 command_runner.go:130] >           },
	I1217 20:25:47.257206  414292 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1217 20:25:47.257213  414292 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1217 20:25:47.257217  414292 command_runner.go:130] >           "runtimePath": "",
	I1217 20:25:47.257224  414292 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1217 20:25:47.257229  414292 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1217 20:25:47.257233  414292 command_runner.go:130] >           "snapshotter": ""
	I1217 20:25:47.257238  414292 command_runner.go:130] >         }
	I1217 20:25:47.257241  414292 command_runner.go:130] >       }
	I1217 20:25:47.257246  414292 command_runner.go:130] >     },
	I1217 20:25:47.257261  414292 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1217 20:25:47.257269  414292 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1217 20:25:47.257274  414292 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1217 20:25:47.257280  414292 command_runner.go:130] >     "disableApparmor": false,
	I1217 20:25:47.257290  414292 command_runner.go:130] >     "disableHugetlbController": true,
	I1217 20:25:47.257294  414292 command_runner.go:130] >     "disableProcMount": false,
	I1217 20:25:47.257299  414292 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1217 20:25:47.257303  414292 command_runner.go:130] >     "enableCDI": true,
	I1217 20:25:47.257309  414292 command_runner.go:130] >     "enableSelinux": false,
	I1217 20:25:47.257313  414292 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1217 20:25:47.257318  414292 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1217 20:25:47.257325  414292 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1217 20:25:47.257331  414292 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1217 20:25:47.257336  414292 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1217 20:25:47.257340  414292 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1217 20:25:47.257353  414292 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1217 20:25:47.257358  414292 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257362  414292 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1217 20:25:47.257368  414292 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257375  414292 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1217 20:25:47.257379  414292 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1217 20:25:47.257386  414292 command_runner.go:130] >   },
	I1217 20:25:47.257390  414292 command_runner.go:130] >   "features": {
	I1217 20:25:47.257396  414292 command_runner.go:130] >     "supplemental_groups_policy": true
	I1217 20:25:47.257399  414292 command_runner.go:130] >   },
	I1217 20:25:47.257403  414292 command_runner.go:130] >   "golang": "go1.24.9",
	I1217 20:25:47.257416  414292 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257429  414292 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257433  414292 command_runner.go:130] >   "runtimeHandlers": [
	I1217 20:25:47.257436  414292 command_runner.go:130] >     {
	I1217 20:25:47.257447  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257451  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257455  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257460  414292 command_runner.go:130] >       }
	I1217 20:25:47.257463  414292 command_runner.go:130] >     },
	I1217 20:25:47.257469  414292 command_runner.go:130] >     {
	I1217 20:25:47.257473  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257477  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257481  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257484  414292 command_runner.go:130] >       },
	I1217 20:25:47.257488  414292 command_runner.go:130] >       "name": "runc"
	I1217 20:25:47.257494  414292 command_runner.go:130] >     }
	I1217 20:25:47.257497  414292 command_runner.go:130] >   ],
	I1217 20:25:47.257502  414292 command_runner.go:130] >   "status": {
	I1217 20:25:47.257506  414292 command_runner.go:130] >     "conditions": [
	I1217 20:25:47.257509  414292 command_runner.go:130] >       {
	I1217 20:25:47.257514  414292 command_runner.go:130] >         "message": "",
	I1217 20:25:47.257526  414292 command_runner.go:130] >         "reason": "",
	I1217 20:25:47.257530  414292 command_runner.go:130] >         "status": true,
	I1217 20:25:47.257536  414292 command_runner.go:130] >         "type": "RuntimeReady"
	I1217 20:25:47.257539  414292 command_runner.go:130] >       },
	I1217 20:25:47.257543  414292 command_runner.go:130] >       {
	I1217 20:25:47.257549  414292 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1217 20:25:47.257554  414292 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1217 20:25:47.257563  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257568  414292 command_runner.go:130] >         "type": "NetworkReady"
	I1217 20:25:47.257574  414292 command_runner.go:130] >       },
	I1217 20:25:47.257577  414292 command_runner.go:130] >       {
	I1217 20:25:47.257599  414292 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1217 20:25:47.257609  414292 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1217 20:25:47.257615  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257620  414292 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1217 20:25:47.257626  414292 command_runner.go:130] >       }
	I1217 20:25:47.257629  414292 command_runner.go:130] >     ]
	I1217 20:25:47.257631  414292 command_runner.go:130] >   }
	I1217 20:25:47.257634  414292 command_runner.go:130] > }
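
The JSON dump above is the `crictl info` output minikube collects from the node: RuntimeReady is true, but NetworkReady is false because no CNI config exists in /etc/cni/net.d yet, which is why kindnet is recommended just below. A minimal sketch for surfacing only the unhealthy conditions, assuming crictl and jq are available on the node:

    # Print every CRI status condition that is not healthy.
    sudo crictl info | jq -r '
      .status.conditions[]
      | select(.status == false)
      | "\(.type): \(.reason) - \(.message)"'
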
	I1217 20:25:47.259959  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:47.259981  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:47.259991  414292 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:25:47.260020  414292 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:25:47.260142  414292 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
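
The stanzas above are the four kubeadm documents minikube renders (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration); a few lines below they are copied to the node as /var/tmp/minikube/kubeadm.yaml.new and compared against the active config. A hedged sketch for sanity-checking such a file on the node, assuming a kubeadm recent enough to ship `kubeadm config validate`:

    # Validate the rendered config without applying it.
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubeadm config validate \
      --config /var/tmp/minikube/kubeadm.yaml.new
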
	
	I1217 20:25:47.260216  414292 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:25:47.267498  414292 command_runner.go:130] > kubeadm
	I1217 20:25:47.267517  414292 command_runner.go:130] > kubectl
	I1217 20:25:47.267520  414292 command_runner.go:130] > kubelet
	I1217 20:25:47.268462  414292 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:25:47.268563  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:25:47.276438  414292 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:25:47.289778  414292 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:25:47.303155  414292 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1217 20:25:47.315864  414292 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:25:47.319319  414292 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
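
The grep above confirms the control-plane.minikube.internal alias already resolves inside the node. Were it missing, an idempotent fix would look like this (IP and hostname taken from the log):

    # Append the control-plane alias only if it is not already present.
    grep -q control-plane.minikube.internal /etc/hosts \
      || echo '192.168.49.2	control-plane.minikube.internal' | sudo tee -a /etc/hosts
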
	I1217 20:25:47.319605  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:47.441462  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
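
With the unit files in place, the kubelet is restarted through systemd. A quick way to confirm it came up, assuming the default localhost healthz port (10248) has not been overridden:

    # Check the kubelet service and its local healthz endpoint.
    systemctl is-active kubelet
    curl -s http://127.0.0.1:10248/healthz; echo
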
	I1217 20:25:47.463080  414292 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:25:47.463150  414292 certs.go:195] generating shared ca certs ...
	I1217 20:25:47.463190  414292 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:47.463362  414292 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:25:47.463461  414292 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:25:47.463501  414292 certs.go:257] generating profile certs ...
	I1217 20:25:47.463662  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:25:47.463774  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:25:47.463860  414292 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:25:47.463894  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1217 20:25:47.463938  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1217 20:25:47.463977  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1217 20:25:47.464005  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1217 20:25:47.464049  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1217 20:25:47.464079  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1217 20:25:47.464117  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1217 20:25:47.464151  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1217 20:25:47.464241  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:47.464342  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:47.464377  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:47.464421  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:47.464488  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:47.464541  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:47.464629  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:47.464693  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.464733  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.464771  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.469220  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:25:47.495389  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:25:47.516308  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:25:47.535144  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:25:47.552466  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:25:47.570909  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:25:47.588173  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:25:47.606011  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:25:47.623433  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:47.640520  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:47.657751  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:47.675695  414292 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
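
All CA, apiserver, and proxy-client material has now been copied onto the node. One way to verify a copy matches its local source is to compare SHA-256 fingerprints on both sides (paths from the log; `minikube ssh` is assumed to reach this profile):

    # The two fingerprints must be identical.
    openssl x509 -noout -fingerprint -sha256 \
      -in /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt
    minikube -p functional-682596 ssh \
      "sudo openssl x509 -noout -fingerprint -sha256 -in /var/lib/minikube/certs/apiserver.crt"
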
	I1217 20:25:47.688487  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:47.694560  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:47.694946  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.702368  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:47.710124  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713826  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713858  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713917  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.754917  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:47.755445  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:47.763008  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.770327  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:47.778030  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782014  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782042  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782099  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.822920  414292 command_runner.go:130] > b5213941
	I1217 20:25:47.823058  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:47.830582  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.837906  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:47.845640  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849463  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849531  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849600  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.890040  414292 command_runner.go:130] > 51391683
	I1217 20:25:47.890555  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
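
The hashes printed above (3ec20f2e, b5213941, 51391683) are OpenSSL subject-name hashes, and the `<hash>.0` symlink is how OpenSSL's directory lookup finds a trusted CA, the same layout c_rehash produces. The pattern for a single cert, using a path from the log:

    # Link a CA cert under its subject hash so OpenSSL can find it by directory lookup.
    cert=/usr/share/ca-certificates/minikubeCA.pem
    h=$(openssl x509 -hash -noout -in "$cert")
    sudo ln -fs "$cert" "/etc/ssl/certs/${h}.0"
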
	I1217 20:25:47.898150  414292 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901790  414292 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901872  414292 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1217 20:25:47.901887  414292 command_runner.go:130] > Device: 259,1	Inode: 1060771     Links: 1
	I1217 20:25:47.901895  414292 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.901902  414292 command_runner.go:130] > Access: 2025-12-17 20:21:41.033930957 +0000
	I1217 20:25:47.901907  414292 command_runner.go:130] > Modify: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901912  414292 command_runner.go:130] > Change: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901921  414292 command_runner.go:130] >  Birth: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901988  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:25:47.942293  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.942780  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:25:47.983019  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.983513  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:25:48.024341  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.024837  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:25:48.065771  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.066190  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:25:48.107223  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.107692  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:25:48.148374  414292 command_runner.go:130] > Certificate will not expire
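
Each `-checkend 86400` call asks OpenSSL whether the certificate expires within the next 24 hours (86400 seconds); exit status 0 means it stays valid, hence "Certificate will not expire". The same sweep as a loop over the certs checked above:

    # Flag any control-plane cert that expires within 24 hours.
    for c in apiserver-etcd-client apiserver-kubelet-client front-proxy-client \
             etcd/server etcd/healthcheck-client etcd/peer; do
      sudo openssl x509 -noout -checkend 86400 \
        -in "/var/lib/minikube/certs/${c}.crt" || echo "EXPIRING SOON: ${c}"
    done
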
	I1217 20:25:48.148810  414292 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:48.148912  414292 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:25:48.148983  414292 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:25:48.175983  414292 cri.go:89] found id: ""
	I1217 20:25:48.176056  414292 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:25:48.182939  414292 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1217 20:25:48.182960  414292 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1217 20:25:48.182967  414292 command_runner.go:130] > /var/lib/minikube/etcd:
	I1217 20:25:48.183854  414292 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:25:48.183910  414292 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:25:48.183977  414292 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:25:48.191197  414292 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:25:48.191635  414292 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.191740  414292 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "functional-682596" cluster setting kubeconfig missing "functional-682596" context setting]
	I1217 20:25:48.192034  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.192565  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.192744  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.193250  414292 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 20:25:48.193273  414292 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 20:25:48.193281  414292 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 20:25:48.193286  414292 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 20:25:48.193293  414292 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1217 20:25:48.193576  414292 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:25:48.193650  414292 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1217 20:25:48.201269  414292 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1217 20:25:48.201338  414292 kubeadm.go:602] duration metric: took 17.417602ms to restartPrimaryControlPlane
	I1217 20:25:48.201355  414292 kubeadm.go:403] duration metric: took 52.552362ms to StartCluster
	I1217 20:25:48.201370  414292 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.201429  414292 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.202007  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.202208  414292 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:25:48.202539  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:48.202581  414292 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 20:25:48.202699  414292 addons.go:70] Setting storage-provisioner=true in profile "functional-682596"
	I1217 20:25:48.202717  414292 addons.go:239] Setting addon storage-provisioner=true in "functional-682596"
	I1217 20:25:48.202742  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.202770  414292 addons.go:70] Setting default-storageclass=true in profile "functional-682596"
	I1217 20:25:48.202806  414292 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-682596"
	I1217 20:25:48.203165  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.203224  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.208687  414292 out.go:179] * Verifying Kubernetes components...
	I1217 20:25:48.211692  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:48.230383  414292 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 20:25:48.233339  414292 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.233361  414292 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 20:25:48.233423  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.236813  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.236975  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.237238  414292 addons.go:239] Setting addon default-storageclass=true in "functional-682596"
	I1217 20:25:48.237267  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.237711  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.262897  414292 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:48.262919  414292 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 20:25:48.262996  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.269972  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.294767  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.418586  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:48.450623  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.465245  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.239916  414292 node_ready.go:35] waiting up to 6m0s for node "functional-682596" to be "Ready" ...
	I1217 20:25:49.240030  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.240095  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
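
The readiness wait that starts here polls GET /api/v1/nodes/functional-682596 until the node's Ready condition turns True. The equivalent check from a shell, assuming a kubeconfig that points at this cluster:

    # Poll until the node reports Ready=True.
    until [ "$(kubectl get node functional-682596 \
          -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}')" = "True" ]; do
      sleep 2
    done
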
	I1217 20:25:49.240342  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240376  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240403  414292 retry.go:31] will retry after 252.350229ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240440  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240459  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240479  414292 retry.go:31] will retry after 321.821783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.493033  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.547929  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.551638  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.551667  414292 retry.go:31] will retry after 328.531722ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.562869  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.621023  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.625124  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.625209  414292 retry.go:31] will retry after 442.103425ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
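
Every apply in this stretch fails the same way: kubectl's client-side validation needs the apiserver's /openapi/v2 endpoint, and nothing is listening on port 8441 yet, so the connection is refused before the manifest is even sent. minikube's retry/backoff eventually succeeds once the apiserver binds; a hand-rolled equivalent would wait first (the /readyz endpoint is usually reachable anonymously on a default cluster):

    # Wait for the apiserver, then apply. Passing --validate=false instead would
    # skip the openapi fetch, at the cost of client-side schema checks.
    until curl -ksf https://192.168.49.2:8441/readyz >/dev/null; do sleep 1; done
    sudo KUBECONFIG=/var/lib/minikube/kubeconfig \
      /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force \
      -f /etc/kubernetes/addons/storage-provisioner.yaml
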
	I1217 20:25:49.740481  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.740559  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.740872  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.881274  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.942102  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.945784  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.945890  414292 retry.go:31] will retry after 409.243705ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.068055  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.127397  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.131721  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.131759  414292 retry.go:31] will retry after 566.560423ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.241000  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.241406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.355732  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:50.414970  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.419857  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.419893  414292 retry.go:31] will retry after 763.212709ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.699479  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.741041  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.741134  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.741465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.776772  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.776815  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.776839  414292 retry.go:31] will retry after 1.24877806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.183473  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:51.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.240545  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:51.240594  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
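
The node poll now hits the same refused connection as the addon applies, confirming the apiserver itself is down rather than a TLS or auth problem. A direct probe that distinguishes the two (IP and port from the log):

    # "connection refused" means nothing is bound to 8441; a TLS or HTTP error
    # would instead mean the apiserver is up but unhealthy.
    curl -vk https://192.168.49.2:8441/healthz
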
	I1217 20:25:51.251909  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:51.255943  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.255983  414292 retry.go:31] will retry after 1.271740821s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.740532  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.740649  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.026483  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:52.095052  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.095119  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.095140  414292 retry.go:31] will retry after 1.58694383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.240430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.240682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.528382  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:52.586445  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.590032  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.590066  414292 retry.go:31] will retry after 1.445188932s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.740386  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.740818  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:53.240660  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
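
Each poll above is the node_ready waiter asking for /api/v1/nodes/functional-682596 and failing at the TCP dial before any HTTP status comes back (hence the empty status="" responses). For reference, a standalone check of the same Ready condition via client-go, reusing the kubeconfig path from the log; a sketch, not the waiter's code:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	node, err := cs.CoreV1().Nodes().Get(context.Background(), "functional-682596", metav1.GetOptions{})
	if err != nil {
		// While the apiserver is down this fails exactly like the log:
		// dial tcp 192.168.49.2:8441: connect: connection refused.
		fmt.Println("node not reachable:", err)
		return
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			fmt.Println("Ready condition:", c.Status)
		}
	}
}
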
	I1217 20:25:53.682297  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:53.740043  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.740108  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.740352  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.743851  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:53.743882  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:53.743900  414292 retry.go:31] will retry after 2.69671946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.036496  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:54.096053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:54.096099  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.096122  414292 retry.go:31] will retry after 2.925706415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.240487  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.240571  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.240903  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:54.740656  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.741104  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:55.240849  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.240918  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:55.241222  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:55.741059  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.741137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.440979  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:56.500702  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:56.500749  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.500767  414292 retry.go:31] will retry after 1.84810195s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.740201  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.740503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.023057  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:57.082954  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:57.083001  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.083020  414292 retry.go:31] will retry after 3.223759279s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
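
Every apply fails at the same point: kubectl first fetches the OpenAPI schema from https://localhost:8441/openapi/v2 to validate the manifest, and that GET is refused. Note that the --validate=false hint in the error would only skip the schema fetch; the apply request itself would still hit the same dead port, which is presumably why minikube retries instead. A small probe of the two endpoints involved (a hypothetical diagnostic, not part of the test, with TLS verification skipped because this targets the local apiserver's self-signed cert):

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 5 * time.Second,
		Transport: &http.Transport{
			// Local health probe only; do not skip verification elsewhere.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	for _, url := range []string{
		"https://localhost:8441/openapi/v2?timeout=32s",
		"https://localhost:8441/healthz",
	} {
		resp, err := client.Get(url)
		if err != nil {
			// Expect "connection refused" while the apiserver is down.
			fmt.Println(url, "->", err)
			continue
		}
		fmt.Println(url, "->", resp.Status)
		resp.Body.Close()
	}
}
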
	I1217 20:25:57.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.240558  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.740268  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.740347  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:57.740756  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:58.240571  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.240660  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:58.349268  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:58.403710  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:58.407286  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.407317  414292 retry.go:31] will retry after 3.305771044s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.740858  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.240560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.740145  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.740223  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:00.240307  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.240806  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:00.240857  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:00.307216  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:00.372358  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:00.376526  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.376564  414292 retry.go:31] will retry after 8.003704403s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.740135  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.740543  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.240281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.240535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.713237  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:01.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.741019  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.741278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.769053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:01.772711  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:01.772742  414292 retry.go:31] will retry after 3.267552643s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:02.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.240302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:02.740266  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:02.740769  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:03.240210  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:03.740228  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.240943  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.740734  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.741190  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:04.741246  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:05.040756  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:05.102503  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:05.102552  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.102572  414292 retry.go:31] will retry after 12.344413157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.240913  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.241244  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:05.740855  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.741188  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.241119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.241411  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.740571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:07.240279  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.240353  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:07.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
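
The request timestamps above land on a steady half-second grid (…240…, …740…), so unlike the addon appliers the readiness waiter polls on a fixed ~500ms interval rather than backing off. A sketch of that cadence with a plain ticker, where checkNode is a hypothetical stand-in for the GET seen in the log (not minikube's waiter):

package main

import (
	"errors"
	"fmt"
	"time"
)

// checkNode stands in for GET /api/v1/nodes/functional-682596.
func checkNode() error {
	return errors.New("dial tcp 192.168.49.2:8441: connect: connection refused")
}

func main() {
	ticker := time.NewTicker(500 * time.Millisecond)
	defer ticker.Stop()
	deadline := time.Now().Add(5 * time.Second) // the real waiter runs far longer
	for range ticker.C {
		if err := checkNode(); err == nil {
			fmt.Println("node reachable")
			return
		} else {
			fmt.Println("will retry:", err)
		}
		if time.Now().After(deadline) {
			fmt.Println("timed out waiting for node")
			return
		}
	}
}
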
	I1217 20:26:07.740195  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.740591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.240525  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.240914  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.381383  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:08.435212  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:08.439369  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.439410  414292 retry.go:31] will retry after 8.892819822s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.740968  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.741390  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.240230  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.240616  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.740331  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.740408  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.740742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:09.740801  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:10.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.240780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:10.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.240651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.740164  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.740235  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.740510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:12.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:12.240683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:12.740202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.240171  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.240526  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:14.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.240715  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.241059  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:14.241125  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:14.741075  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.741149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.741406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.240079  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.240494  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.740230  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.740334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.240695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:16.740834  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:17.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.240434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:17.333063  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:17.388410  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.391967  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.391995  414292 retry.go:31] will retry after 13.113728844s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.447345  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:17.505124  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.505163  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.505182  414292 retry.go:31] will retry after 11.452403849s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
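
Across this section the apply delays grow from about 1.2s to over 13s, consistent with a capped, jittered exponential backoff. A sketch of such a policy; the base, growth factor, and cap are guesses fitted to the observed durations, since the log only shows the results:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// backoff returns a jittered, capped exponential delay. The constants
// are assumptions fitted to the 1.2s-13.1s range seen above.
func backoff(attempt int) time.Duration {
	d := 1200 * time.Millisecond
	for i := 0; i < attempt; i++ {
		d = time.Duration(float64(d) * 1.5)
	}
	if cap := 13 * time.Second; d > cap {
		d = cap
	}
	jitter := time.Duration(rand.Int63n(int64(d) / 4)) // up to +25%
	return d + jitter
}

func main() {
	for i := 0; i < 8; i++ {
		fmt.Printf("attempt %d: retry after %v\n", i, backoff(i))
	}
}

Both the addon appliers and the readiness waiter are blocked on the same refused port (8441, via localhost and 192.168.49.2), so the repeated failures below all trace back to kube-apiserver never coming back up rather than to the manifests themselves.
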
	I1217 20:26:17.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.740629  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.740885  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.240553  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.240633  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.740512  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.740589  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.740904  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:18.740955  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:19.240885  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.240962  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.241213  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:19.741015  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.741087  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.240627  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.740183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:21.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.240698  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:21.240754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:21.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.740628  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.240933  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.741110  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.741224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.741585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.240288  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.240369  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.741096  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.741162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.741447  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:23.741492  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... the same ~500ms poll loop continued from 20:26:24.240 to 20:26:28.741 with empty responses; node_ready "will retry" warnings (dial tcp 192.168.49.2:8441: connect: connection refused) repeated at 20:26:26.240 and 20:26:28.240 ...]
	I1217 20:26:28.958534  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:29.018842  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:29.024509  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.024543  414292 retry.go:31] will retry after 28.006345092s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
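
The addon path uses a separate mechanism: each failed kubectl apply is rescheduled by retry.go with a growing, jittered delay (28.0s here, then 31.9s, 46.7s and 42.6s further down). The sketch below shows that shape, assuming exponential backoff with jitter; the attempt cap, base delay and jitter formula are illustrative assumptions, not minikube's retry.go internals.

package addons

// Shape of the apply-and-reschedule behavior in the log: run an apply, and
// on failure wait a growing, jittered delay before the next attempt. The
// backoff formula is an assumption; only the pattern comes from the output.

import (
	"fmt"
	"math/rand"
	"time"
)

func retryApply(apply func() error, attempts int, base time.Duration) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = apply(); err == nil {
			return nil
		}
		// Exponential backoff plus jitter yields uneven waits like the
		// 28.0s, 31.9s, 46.7s and 42.6s delays seen in this log.
		d := base<<uint(i) + time.Duration(rand.Int63n(int64(base)))
		fmt.Printf("apply failed, will retry after %s: %v\n", d, err)
		time.Sleep(d)
	}
	return err
}
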
	[... polls continued from 20:26:29.241 to 20:26:30.241 with empty responses; another node_ready "will retry" warning (connection refused) at 20:26:30.241 ...]
	I1217 20:26:30.505938  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:30.574101  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:30.574147  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.574166  414292 retry.go:31] will retry after 31.982210322s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
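
Both failure modes above reduce to "connect: connection refused", on 192.168.49.2:8441 for the node polls and on [::1]:8441 for kubectl's openapi download, which points at the apiserver not listening at all rather than a TLS or RBAC problem. A quick, hypothetical way to confirm that from the host is to dial the endpoint directly:

package main

// Hypothetical spot check: repeated "connect: connection refused" on the
// apiserver port means nothing is listening there. Dialing the endpoint
// directly confirms the failure is at the TCP layer, before TLS or
// authentication are even attempted.

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "192.168.49.2:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver endpoint unreachable:", err) // expect "connection refused" while it is down
		return
	}
	conn.Close()
	fmt.Println("TCP connect succeeded; any remaining failure is above the transport layer")
}
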
	[... the poll loop ran on from 20:26:30.740 to 20:26:56.740; one response at 20:26:31.248 took 7ms but still carried no status, all others were empty; node_ready "will retry" warnings (connection refused) recurred roughly every 2 to 2.5s, at 20:26:32.740, 34.741, 37.240, 39.241, 41.740, 44.240, 46.241, 48.741, 51.240, 53.241 and 55.740 ...]
	I1217 20:26:57.031083  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:57.091368  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:57.091412  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.091434  414292 retry.go:31] will retry after 46.71155063s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... polls continued from 20:26:57.240 to 20:27:02.240 with empty responses; node_ready "will retry" warnings (connection refused) at 20:26:57.741, 20:26:59.741 and 20:27:01.741 ...]
	I1217 20:27:02.557038  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:02.616976  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:02.620493  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.620531  414292 retry.go:31] will retry after 42.622456402s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	[... polls continued from 20:27:02.740 to 20:27:17.240 with empty responses; node_ready "will retry" warnings (connection refused) at 20:27:04.240, 20:27:06.240, 20:27:08.240, 20:27:10.740, 20:27:13.240 and 20:27:15.740 ...]
	I1217 20:27:17.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:27:17.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:17.740687  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:18.240626  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.240717  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.241052  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:18.241112  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:18.740882  414292 type.go:168] "Request Body" body=""
	I1217 20:27:18.740963  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:18.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.240070  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.240154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.240424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:19.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:19.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:19.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.240358  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.240772  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:20.740370  414292 type.go:168] "Request Body" body=""
	I1217 20:27:20.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:20.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:20.740740  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:21.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.240550  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.240867  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:21.740170  414292 type.go:168] "Request Body" body=""
	I1217 20:27:21.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:21.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.240286  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.240355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:22.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:27:22.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:22.740746  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:22.740812  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:23.240211  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:23.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:27:23.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:23.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.240106  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.240203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.240577  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:24.740327  414292 type.go:168] "Request Body" body=""
	I1217 20:27:24.740411  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:24.740719  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:25.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.240459  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.240721  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:25.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:25.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:25.740294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:25.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.240307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:26.741005  414292 type.go:168] "Request Body" body=""
	I1217 20:27:26.741084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:26.741343  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:27.241172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.241259  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.241612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:27.241675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:27.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:27.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:27.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.240549  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.240616  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.240862  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:28.740178  414292 type.go:168] "Request Body" body=""
	I1217 20:27:28.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:28.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.240606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:29.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:27:29.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:29.740469  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:29.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:30.240204  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.240594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:30.740299  414292 type.go:168] "Request Body" body=""
	I1217 20:27:30.740381  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:30.740724  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.240991  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.241062  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.241311  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:31.741040  414292 type.go:168] "Request Body" body=""
	I1217 20:27:31.741119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:31.741431  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:31.741486  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:32.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:32.740134  414292 type.go:168] "Request Body" body=""
	I1217 20:27:32.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:32.740528  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.240226  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.240327  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.240647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:33.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:27:33.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:33.740611  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:34.240530  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.240875  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:34.240919  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:34.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:27:34.740829  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:34.741167  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.240992  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.241075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:35.740071  414292 type.go:168] "Request Body" body=""
	I1217 20:27:35.740149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:35.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.240282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:36.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:27:36.740384  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:36.740734  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:36.740791  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:37.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.240426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.240679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:37.740203  414292 type.go:168] "Request Body" body=""
	I1217 20:27:37.740290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:37.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.240504  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.240898  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:38.740390  414292 type.go:168] "Request Body" body=""
	I1217 20:27:38.740487  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:38.740891  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:38.740956  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:39.240903  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.240984  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.241256  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:39.741008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:39.741080  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:39.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.241008  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.241078  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.241381  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:40.740111  414292 type.go:168] "Request Body" body=""
	I1217 20:27:40.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:40.740522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:41.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:41.240677  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:41.740082  414292 type.go:168] "Request Body" body=""
	I1217 20:27:41.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:41.740465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.240648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:42.740376  414292 type.go:168] "Request Body" body=""
	I1217 20:27:42.740469  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:42.740797  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.240389  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.240470  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.240795  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:43.240842  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:43.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:27:43.740422  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:43.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:43.803299  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:27:43.859759  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863204  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863297  414292 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
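This apply fails before anything reaches the cluster: kubectl's client-side validation tries to download the OpenAPI schema from localhost:8441, gets connection refused, and exits non-zero, which addons.go records as "apply failed, will retry". A minimal sketch of that run-and-retry shape, reusing the paths from the logged command but with an invented attempt count and backoff (this is not minikube's addons code):

package main

import (
	"fmt"
	"os/exec"
	"time"
)

// applyWithRetry shells out to kubectl the way the log shows and treats any
// non-zero exit as retryable. The retry policy here is an assumption.
func applyWithRetry(manifest string, attempts int) error {
	var lastErr error
	for i := 0; i < attempts; i++ {
		cmd := exec.Command("sudo",
			"KUBECONFIG=/var/lib/minikube/kubeconfig", // sudo accepts leading VAR=value assignments
			"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
			"apply", "--force", "-f", manifest)
		out, err := cmd.CombinedOutput()
		if err == nil {
			return nil
		}
		lastErr = fmt.Errorf("apply %s: %w\noutput:\n%s", manifest, err, out)
		time.Sleep(2 * time.Second) // back off before the retry the warning promises
	}
	return lastErr
}

func main() {
	if err := applyWithRetry("/etc/kubernetes/addons/storageclass.yaml", 3); err != nil {
		fmt.Println("! Enabling 'default-storageclass' returned an error:", err)
	}
}

Note that the --validate=false hint in stderr would only skip the schema download; with the apiserver refusing all connections on 8441, the subsequent submission would fail the same way, so retrying until the apiserver is back is the only path forward.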
	I1217 20:27:44.241025  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.241121  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:44.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:27:44.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:44.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:45.240574  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.240742  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.242019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	W1217 20:27:45.242186  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:45.244153  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:45.319007  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319122  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319226  414292 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:45.322350  414292 out.go:179] * Enabled addons: 
	I1217 20:27:45.325857  414292 addons.go:530] duration metric: took 1m57.123269017s for enable addons: enabled=[]
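The summary pair above is the usual duration-metric pattern: stamp time.Now() before the addon phase, log time.Since() after it, and report whichever addons survived their callbacks; since both applies failed, the list comes back empty. An illustrative sketch of that accounting (enableAddons and apply are stand-ins, not minikube's real functions):

package main

import (
	"fmt"
	"log"
	"time"
)

// apply stands in for the kubectl apply calls that failed above.
func apply(name string) error {
	return fmt.Errorf("connect: connection refused")
}

// enableAddons runs each addon callback and keeps only the ones that succeed.
func enableAddons(names []string) (enabled []string) {
	for _, name := range names {
		if err := apply(name); err != nil {
			log.Printf("! Enabling '%s' returned an error: %v", name, err)
			continue // failed addons are not appended, hence enabled=[]
		}
		enabled = append(enabled, name)
	}
	return enabled
}

func main() {
	start := time.Now()
	enabled := enableAddons([]string{"default-storageclass", "storage-provisioner"})
	log.Printf("duration metric: took %s for enable addons: enabled=%v",
		time.Since(start), enabled)
}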
	I1217 20:27:45.740623  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.740707  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.741048  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.240887  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.240956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:46.741061  414292 type.go:168] "Request Body" body=""
	I1217 20:27:46.741135  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:46.741496  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:47.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:27:47.745570  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:47.746779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:47.746914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:48.240563  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.240990  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:48.740796  414292 type.go:168] "Request Body" body=""
	I1217 20:27:48.740881  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:48.741219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.240326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.240395  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:49.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:27:49.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:49.740594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:50.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:50.240662  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:50.741072  414292 type.go:168] "Request Body" body=""
	I1217 20:27:50.741139  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:50.741398  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.240183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:51.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:27:51.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:51.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.240123  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.240196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:52.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:27:52.740295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:52.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:52.740683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:53.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.240776  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:53.740345  414292 type.go:168] "Request Body" body=""
	I1217 20:27:53.740444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:53.740732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.240671  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.240743  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.241055  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:54.740834  414292 type.go:168] "Request Body" body=""
	I1217 20:27:54.740911  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:54.741241  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:54.741301  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:55.241020  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.241088  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.241340  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:55.740115  414292 type.go:168] "Request Body" body=""
	I1217 20:27:55.740205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:55.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.240618  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:56.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:27:56.740375  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:56.740674  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:57.240163  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.240242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:57.240625  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:57.740326  414292 type.go:168] "Request Body" body=""
	I1217 20:27:57.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:57.740758  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.240606  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.240676  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.240930  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:58.740703  414292 type.go:168] "Request Body" body=""
	I1217 20:27:58.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:58.741128  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:59.240930  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.241008  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.241330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:59.241390  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:59.741097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:59.741170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:59.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.240277  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.240360  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.240691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:00.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:28:00.740453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:00.740814  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.240754  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:01.740452  414292 type.go:168] "Request Body" body=""
	I1217 20:28:01.740539  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:01.740931  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:01.740984  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:02.240802  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.240878  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.241186  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:02.740889  414292 type.go:168] "Request Body" body=""
	I1217 20:28:02.740958  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:02.741285  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.241130  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.241210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.241568  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:03.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:28:03.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:03.740675  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:04.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.240501  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.240759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:04.240799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:04.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:04.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:04.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.240365  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.240442  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.240770  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:05.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:05.740435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:05.740725  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.240330  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.240732  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:06.740442  414292 type.go:168] "Request Body" body=""
	I1217 20:28:06.740525  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:06.740853  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:06.740914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:07.240349  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.240678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:07.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:07.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:07.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.240587  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.240906  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:08.740708  414292 type.go:168] "Request Body" body=""
	I1217 20:28:08.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:08.741159  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:08.741223  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:09.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.241169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.241495  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:09.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:28:09.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:09.740766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.240377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.240453  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.240750  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:10.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:28:10.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:10.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:11.240243  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.240712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:11.240767  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:28:11.740446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:11.740716  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.240420  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.240502  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.240833  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:12.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:28:12.740634  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:12.740952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:13.240720  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.240805  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:13.241122  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:13.740900  414292 type.go:168] "Request Body" body=""
	I1217 20:28:13.740983  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:13.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.240146  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:14.740140  414292 type.go:168] "Request Body" body=""
	I1217 20:28:14.740209  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:14.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.240203  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.240633  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:15.740840  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:16.240359  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.240436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.240823  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:16.740505  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.740583  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.740911  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.240693  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.240772  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.241095  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.740839  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.740925  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:17.741241  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:18.241106  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.241186  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.241520  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:18.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.740344  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.240841  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:20.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.240438  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.240765  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:20.240823  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:20.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.740426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.740691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.240640  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.740465  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.740819  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.240416  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.240484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.240741  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.740262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.740592  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:22.740654  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:23.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.740759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.240649  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.240730  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.740869  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.741289  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:24.741351  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:25.241053  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:25.740052  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.740478  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.240294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.740507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:27.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:27.240679  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:27.740368  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.240689  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.240769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.740814  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.740891  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.741191  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:29.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.241401  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:29.241453  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:29.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.740474  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.240176  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.740407  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.740484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.740834  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.240680  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.740183  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:31.740674  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:32.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:32.740110  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.740177  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.740450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.240131  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.240205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.240522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.740169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:34.240083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.240169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.240471  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:34.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:34.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.740604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.740083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.740163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.740501  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:36.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:36.240620  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:36.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.240126  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.240510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.740120  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.740203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.740533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:38.240393  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.240473  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.240804  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:38.240859  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:38.740418  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.740493  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.740792  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.240679  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.240778  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.241109  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.740934  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.741010  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.741373  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.240097  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.240172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.240452  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:40.740634  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:41.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.240273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:41.740062  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.740137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.740430  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.240290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:42.740831  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:43.240367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.240476  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:43.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.740560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.240459  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.240534  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.240870  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.740695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:45.240439  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.240568  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.241041  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:45.241102  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:45.740871  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.740956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.741304  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.241069  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.241138  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.740186  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.740282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:47.740657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:48.240600  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.241014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:48.740811  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.740888  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.741185  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.241327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.741121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.741215  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.741596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:49.741649  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:50.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:50.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.740508  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.240167  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.240239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.740310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:52.240373  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.240709  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:52.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:52.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.740584  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.740309  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.740380  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:54.240655  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:54.241123  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:54.740804  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.741266  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.241066  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.241142  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.240178  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:56.740664  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:57.240172  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:57.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.740644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.240496  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.240566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.240825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.740558  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.740638  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:58.741026  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... the same GET https://192.168.49.2:8441/api/v1/nodes/functional-682596 poll repeats every ~500ms from 20:28:59 through 20:30:01, each attempt logging an empty "Request Body" and an empty "Response"; node_ready.go:55 emits the identical "connection refused" warning roughly every 2-2.5s (20:29:01, 20:29:03, 20:29:05, ... 20:29:57, 20:29:59) while the apiserver on 192.168.49.2:8441 remains unreachable ...]
	I1217 20:30:01.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.240569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:01.740198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:01.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:01.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:01.740706  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:02.240225  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.240325  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:02.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:02.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:02.740599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.240638  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:03.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:30:03.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:03.740781  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:03.740851  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:04.240827  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.240905  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.241246  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:04.741038  414292 type.go:168] "Request Body" body=""
	I1217 20:30:04.741117  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:04.741482  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.240086  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.240166  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.240523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:05.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:30:05.740313  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:05.740583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:06.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:06.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:06.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:30:06.740292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:06.740641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.240137  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.240232  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:07.740212  414292 type.go:168] "Request Body" body=""
	I1217 20:30:07.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:07.740677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:08.240572  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.240703  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.241012  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:08.740470  414292 type.go:168] "Request Body" body=""
	I1217 20:30:08.740547  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:08.740807  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.240838  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.240937  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.241278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:09.741105  414292 type.go:168] "Request Body" body=""
	I1217 20:30:09.741196  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:09.741522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.240237  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.240593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:10.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:10.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:10.740639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:10.740692  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:11.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.240314  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.240666  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:11.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:30:11.740357  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:11.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.240296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:12.740302  414292 type.go:168] "Request Body" body=""
	I1217 20:30:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:12.740745  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:12.740799  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:13.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.240338  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.240649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:13.740420  414292 type.go:168] "Request Body" body=""
	I1217 20:30:13.740503  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:13.740905  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.240722  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.240802  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.241096  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:14.740830  414292 type.go:168] "Request Body" body=""
	I1217 20:30:14.740904  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:14.741180  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:14.741236  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:15.241024  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.241450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:15.741152  414292 type.go:168] "Request Body" body=""
	I1217 20:30:15.741234  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:15.741523  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:16.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:30:16.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:16.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:17.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.240605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:17.240659  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:17.740234  414292 type.go:168] "Request Body" body=""
	I1217 20:30:17.740318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:17.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.240499  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.240576  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.240897  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:18.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:30:18.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:18.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:19.240591  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.240664  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:19.240962  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:19.740705  414292 type.go:168] "Request Body" body=""
	I1217 20:30:19.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:19.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.240934  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.241283  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:20.741051  414292 type.go:168] "Request Body" body=""
	I1217 20:30:20.741126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:20.741393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.240193  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.240534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:21.740204  414292 type.go:168] "Request Body" body=""
	I1217 20:30:21.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:21.740636  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:21.740690  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:22.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:30:22.240185  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:22.240483  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:22.740163  414292 type.go:168] "Request Body" body=""
	I1217 20:30:22.740266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:22.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:23.240329  414292 type.go:168] "Request Body" body=""
	I1217 20:30:23.240410  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:23.240708  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:23.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:30:23.740421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:23.740715  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:23.740754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:24.240749  414292 type.go:168] "Request Body" body=""
	I1217 20:30:24.240827  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:24.241165  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:24.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:30:24.741018  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:24.741341  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:25.241107  414292 type.go:168] "Request Body" body=""
	I1217 20:30:25.241181  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:25.241513  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:25.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:30:25.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:25.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:26.240231  414292 type.go:168] "Request Body" body=""
	I1217 20:30:26.240323  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:26.240661  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:26.240712  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:26.740311  414292 type.go:168] "Request Body" body=""
	I1217 20:30:26.740386  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:26.740706  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:27.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:30:27.240224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:27.240591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:27.740314  414292 type.go:168] "Request Body" body=""
	I1217 20:30:27.740399  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:27.740736  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:28.240510  414292 type.go:168] "Request Body" body=""
	I1217 20:30:28.240580  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:28.240845  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:28.240885  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:28.740684  414292 type.go:168] "Request Body" body=""
	I1217 20:30:28.740769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:28.741122  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:29.240868  414292 type.go:168] "Request Body" body=""
	I1217 20:30:29.240943  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:29.241278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:29.741010  414292 type.go:168] "Request Body" body=""
	I1217 20:30:29.741088  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:29.741347  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:30.240074  414292 type.go:168] "Request Body" body=""
	I1217 20:30:30.240156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:30.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:30.740270  414292 type.go:168] "Request Body" body=""
	I1217 20:30:30.740350  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:30.740682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:30.740742  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:31.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:30:31.240426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:31.240709  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:31.740175  414292 type.go:168] "Request Body" body=""
	I1217 20:30:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:31.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:32.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:30:32.240290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:32.240644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:32.740323  414292 type.go:168] "Request Body" body=""
	I1217 20:30:32.740399  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:32.740721  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:32.740777  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:33.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:30:33.240282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:33.240581  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:33.740289  414292 type.go:168] "Request Body" body=""
	I1217 20:30:33.740384  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:33.740725  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:34.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:30:34.240601  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:34.240877  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:34.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:30:34.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:34.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:35.240329  414292 type.go:168] "Request Body" body=""
	I1217 20:30:35.240407  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:35.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:35.240807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:35.740356  414292 type.go:168] "Request Body" body=""
	I1217 20:30:35.740430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:35.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:36.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:30:36.240292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:36.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:36.740204  414292 type.go:168] "Request Body" body=""
	I1217 20:30:36.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:36.740621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:37.240083  414292 type.go:168] "Request Body" body=""
	I1217 20:30:37.240154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:37.240449  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:37.740150  414292 type.go:168] "Request Body" body=""
	I1217 20:30:37.740225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:37.740559  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:37.740619  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:38.240565  414292 type.go:168] "Request Body" body=""
	I1217 20:30:38.240642  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:38.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:38.740678  414292 type.go:168] "Request Body" body=""
	I1217 20:30:38.740753  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:38.741008  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:39.241038  414292 type.go:168] "Request Body" body=""
	I1217 20:30:39.241115  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:39.241418  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:39.740131  414292 type.go:168] "Request Body" body=""
	I1217 20:30:39.740209  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:39.740536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:40.240118  414292 type.go:168] "Request Body" body=""
	I1217 20:30:40.240199  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:40.240498  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:40.240543  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:40.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:30:40.740188  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:40.740547  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:41.240309  414292 type.go:168] "Request Body" body=""
	I1217 20:30:41.240393  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:41.240720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:41.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:30:41.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:41.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:42.240206  414292 type.go:168] "Request Body" body=""
	I1217 20:30:42.240307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:42.240670  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:42.240729  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:42.740417  414292 type.go:168] "Request Body" body=""
	I1217 20:30:42.740498  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:42.740825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:43.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:30:43.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:43.240782  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:43.740183  414292 type.go:168] "Request Body" body=""
	I1217 20:30:43.740276  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:43.740607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:44.240187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:44.240292  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:44.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:44.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:30:44.740144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:44.740421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:44.740464  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:45.240192  414292 type.go:168] "Request Body" body=""
	I1217 20:30:45.240361  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:45.240784  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:45.740213  414292 type.go:168] "Request Body" body=""
	I1217 20:30:45.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:45.740704  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:46.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:30:46.241111  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:46.241363  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:46.741131  414292 type.go:168] "Request Body" body=""
	I1217 20:30:46.741206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:46.741497  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:46.741543  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:47.240089  414292 type.go:168] "Request Body" body=""
	I1217 20:30:47.240162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:47.240507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:47.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:30:47.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:47.740533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:48.240579  414292 type.go:168] "Request Body" body=""
	I1217 20:30:48.240662  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:48.240968  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:48.740776  414292 type.go:168] "Request Body" body=""
	I1217 20:30:48.740848  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:48.741156  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:49.241082  414292 type.go:168] "Request Body" body=""
	I1217 20:30:49.241169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:49.241428  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:49.241479  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:49.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:30:49.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:49.740617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:50.240370  414292 type.go:168] "Request Body" body=""
	I1217 20:30:50.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:50.240786  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:50.740363  414292 type.go:168] "Request Body" body=""
	I1217 20:30:50.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:50.740703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:51.240208  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.240317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:51.740346  414292 type.go:168] "Request Body" body=""
	I1217 20:30:51.740425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:51.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:51.740821  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:52.240214  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.240300  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:52.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:30:52.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:52.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.240242  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.240341  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.240677  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:53.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:30:53.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:53.740762  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:54.240735  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.240807  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.241141  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:54.241194  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:54.740836  414292 type.go:168] "Request Body" body=""
	I1217 20:30:54.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:54.741298  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.241043  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.241118  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:55.740103  414292 type.go:168] "Request Body" body=""
	I1217 20:30:55.740188  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:55.740556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.240179  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.240272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:56.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:30:56.740424  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:56.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:56.740726  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:57.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.240275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.240613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:57.740358  414292 type.go:168] "Request Body" body=""
	I1217 20:30:57.740443  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:57.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.240510  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.240843  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:58.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:30:58.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:58.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:30:59.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:30:59.240707  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:30:59.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:30:59.740428  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:30:59.740694  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.240295  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.240376  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.240702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:00.740561  414292 type.go:168] "Request Body" body=""
	I1217 20:31:00.740646  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:00.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:01.240812  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.240892  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:01.241213  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:01.740960  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.741043  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.741371  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.240103  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.240187  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.740227  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.740315  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.240590  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.740279  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.740349  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:03.740743  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:04.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.240829  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:04.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.740119  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.740198  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.740527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:06.240220  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.240652  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:06.240704  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:06.740395  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.740474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.740826  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.240699  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.740382  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:08.240694  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.241125  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:08.241178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:08.740933  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.741009  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.741272  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.240107  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.240509  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.740653  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.240221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.240567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.740283  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.740720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:10.740781  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:11.240308  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.240387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.240742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.240582  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.740305  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:13.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.240753  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:13.240804  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:13.740494  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.740566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.740865  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.240695  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.241120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.740883  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.741207  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:15.241001  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.241086  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.241424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:15.241480  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:15.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.240120  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.240190  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.240504  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.740176  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.740245  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.740595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:17.740646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
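Every warning in this stretch is the same failure class: "connect: connection refused" means the TCP handshake reached 192.168.49.2 but no process was bound to port 8441 (the apiserver is down), as opposed to a timeout, which would point at an unreachable host or filtered traffic. A small stdlib-only sketch of telling the two apart (host and port from the log; the ECONNREFUSED check is Linux/POSIX-specific, matching this arm64 Linux job):

    // dialcheck.go - distinguish "refused" (nothing listening) from "timeout"
    // (unreachable or filtered) for the endpoint in the log above.
    package main

    import (
        "errors"
        "fmt"
        "net"
        "syscall"
        "time"
    )

    func main() {
        conn, err := net.DialTimeout("tcp", "192.168.49.2:8441", 2*time.Second)
        var ne net.Error
        switch {
        case err == nil:
            conn.Close()
            fmt.Println("port open: something is listening on 8441")
        case errors.Is(err, syscall.ECONNREFUSED):
            // The case throughout this log: host up, kube-apiserver down.
            fmt.Println("connection refused: no process bound to the port")
        case errors.As(err, &ne) && ne.Timeout():
            fmt.Println("timeout: host unreachable or traffic filtered")
        default:
            fmt.Println("other dial error:", err)
        }
    }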
	I1217 20:31:18.240578  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.240656  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.241010  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:18.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.240589  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.240665  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.240973  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.740527  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.740602  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.740938  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:19.741004  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:20.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.240826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:20.740847  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.741205  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.240999  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.241433  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.741076  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.741157  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.741476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:21.741535  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:22.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.240225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:22.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.740288  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.740671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.240361  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.240439  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.740727  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:24.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.240834  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.241219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:24.241275  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:24.740840  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.741224  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.240994  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.241325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.741146  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.741238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.741600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.240173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.740222  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.740302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:26.740604  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:27.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.740267  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.240847  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.740580  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.740654  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:28.741030  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:29.240927  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.241003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.241345  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:29.740932  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.741003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.741297  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.240066  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.240144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.240477  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:31.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.240227  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:31.240572  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:31.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.240364  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.240793  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.740739  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:33.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:33.240635  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:33.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.740654  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.240597  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.240677  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.240945  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.740794  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.741113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:35.240931  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:35.241431  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:35.740086  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.740458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.740185  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.240233  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.740273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.740567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:37.740616  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:38.240625  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.240697  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:38.740867  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.741194  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.240204  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.740275  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.740669  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:39.740728  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:40.240374  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.240701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:40.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.740679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.240409  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.240499  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.240858  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:41.740768  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:42.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.240703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:42.740571  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.740645  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.740967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.240727  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.240796  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.241050  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.740827  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.740901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.741236  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:43.741293  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:44.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.241176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.241525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:44.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.745967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1217 20:31:45.240798  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.240901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.241310  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:45.741143  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.741226  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.741583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:45.741646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:46.241073  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.241146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.241399  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:46.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.240190  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.740649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:48.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.240554  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.241013  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:48.241064  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:48.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.740603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:49.240608  414292 type.go:168] "Request Body" body=""
	I1217 20:31:49.240675  414292 node_ready.go:38] duration metric: took 6m0.000721639s for node "functional-682596" to be "Ready" ...
	I1217 20:31:49.243794  414292 out.go:203] 
	W1217 20:31:49.246551  414292 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1217 20:31:49.246575  414292 out.go:285] * 
	W1217 20:31:49.249079  414292 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:31:49.251429  414292 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:31:56 functional-682596 containerd[5330]: time="2025-12-17T20:31:56.842985708Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.883089157Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.885250624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.893417768Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.894067556Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.853428363Z" level=info msg="No images store for sha256:f8359c2c10bc3fa09ea92f06d2cc7d3c863814f8c0b38cad60a5f93eb6785f57"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.855759350Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-682596\""
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.862693152Z" level=info msg="ImageCreate event name:\"sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.863308362Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.639741021Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.642083864Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.644000209Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.656111207Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.814254431Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.816646826Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.819891340Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.826461085Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.992051432Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.994318919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.001983551Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.002925501Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.129645302Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.132017331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.142971508Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.143462870Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:32:02.940655    9278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:02.941381    9278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:02.943003    9278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:02.943649    9278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:02.945267    9278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:32:02 up  3:14,  0 user,  load average: 0.29, 0.31, 0.77
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:31:59 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:00 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 824.
	Dec 17 20:32:00 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:00 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:00 functional-682596 kubelet[9070]: E1217 20:32:00.550241    9070 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:00 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:00 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:01 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
	Dec 17 20:32:01 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:01 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:01 functional-682596 kubelet[9157]: E1217 20:32:01.307669    9157 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:01 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:01 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:01 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Dec 17 20:32:01 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:01 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:02 functional-682596 kubelet[9191]: E1217 20:32:02.040035    9191 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 17 20:32:02 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:02 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:02 functional-682596 kubelet[9242]: E1217 20:32:02.815164    9242 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (347.461468ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.34s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.41s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-682596 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-682596 get pods: exit status 1 (113.110572ms)

** stderr ** 
	E1217 20:32:04.041791  419754 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:04.042181  419754 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:04.043660  419754 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:04.044001  419754 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:32:04.045484  419754 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-682596 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (316.205518ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-032730 image ls --format yaml --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format short --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format json --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format table --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh     │ functional-032730 ssh pgrep buildkitd                                                                                                                 │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image   │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete  │ -p functional-032730                                                                                                                                  │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start   │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start   │ -p functional-682596 --alsologtostderr -v=8                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:latest                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add minikube-local-cache-test:functional-682596                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache delete minikube-local-cache-test:functional-682596                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl images                                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │                     │
	│ cache   │ functional-682596 cache reload                                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ kubectl │ functional-682596 kubectl -- --context functional-682596 get pods                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:25:44
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:25:44.045489  414292 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:25:44.045686  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.045714  414292 out.go:374] Setting ErrFile to fd 2...
	I1217 20:25:44.045733  414292 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:25:44.046029  414292 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:25:44.046470  414292 out.go:368] Setting JSON to false
	I1217 20:25:44.047409  414292 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11289,"bootTime":1765991855,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:25:44.047515  414292 start.go:143] virtualization:  
	I1217 20:25:44.053027  414292 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:25:44.056011  414292 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:25:44.056093  414292 notify.go:221] Checking for updates...
	I1217 20:25:44.061883  414292 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:25:44.064833  414292 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:44.067589  414292 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:25:44.070446  414292 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:25:44.073380  414292 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:25:44.076968  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:44.077128  414292 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:25:44.112208  414292 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:25:44.112455  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.167112  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.158029599 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.167209  414292 docker.go:319] overlay module found
	I1217 20:25:44.170171  414292 out.go:179] * Using the docker driver based on existing profile
	I1217 20:25:44.173086  414292 start.go:309] selected driver: docker
	I1217 20:25:44.173109  414292 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.173214  414292 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:25:44.173330  414292 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:25:44.234258  414292 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:25:44.225129855 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:25:44.234785  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:44.234848  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:44.234909  414292 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:44.238034  414292 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:25:44.240853  414292 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:25:44.243760  414292 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:25:44.246713  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:44.246768  414292 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:25:44.246782  414292 cache.go:65] Caching tarball of preloaded images
	I1217 20:25:44.246797  414292 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:25:44.246869  414292 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:25:44.246880  414292 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:25:44.246994  414292 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:25:44.265764  414292 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:25:44.265789  414292 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:25:44.265812  414292 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:25:44.265841  414292 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:25:44.265903  414292 start.go:364] duration metric: took 36.013µs to acquireMachinesLock for "functional-682596"
	I1217 20:25:44.265927  414292 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:25:44.265936  414292 fix.go:54] fixHost starting: 
	I1217 20:25:44.266187  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:44.282574  414292 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:25:44.282603  414292 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:25:44.285918  414292 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:25:44.285950  414292 machine.go:94] provisionDockerMachine start ...
	I1217 20:25:44.286031  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.302759  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.303096  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.303111  414292 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:25:44.431913  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.431939  414292 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:25:44.432002  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.450770  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.451117  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.451136  414292 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:25:44.601580  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:25:44.601732  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:44.619103  414292 main.go:143] libmachine: Using SSH client type: native
	I1217 20:25:44.619412  414292 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:25:44.619435  414292 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:25:44.748545  414292 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:25:44.748571  414292 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:25:44.748593  414292 ubuntu.go:190] setting up certificates
	I1217 20:25:44.748603  414292 provision.go:84] configureAuth start
	I1217 20:25:44.748675  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:44.766057  414292 provision.go:143] copyHostCerts
	I1217 20:25:44.766100  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766141  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:25:44.766152  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:25:44.766226  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:25:44.766327  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766347  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:25:44.766357  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:25:44.766385  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:25:44.766441  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766461  414292 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:25:44.766471  414292 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:25:44.766501  414292 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:25:44.766561  414292 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:25:45.107844  414292 provision.go:177] copyRemoteCerts
	I1217 20:25:45.108657  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:25:45.108873  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.149674  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.277212  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1217 20:25:45.277284  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:25:45.298737  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1217 20:25:45.298796  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:25:45.320659  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1217 20:25:45.320720  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1217 20:25:45.338755  414292 provision.go:87] duration metric: took 590.128101ms to configureAuth
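
configureAuth generated a server certificate with SANs [127.0.0.1 192.168.49.2 functional-682596 localhost minikube] and scp'd it into /etc/docker on the node. A quick way to confirm the SANs actually embedded in such a certificate (path taken from the log; any PEM certificate works):

    # Print the Subject Alternative Names section of the generated server cert.
    openssl x509 -noout -text \
      -in /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem \
      | grep -A1 'Subject Alternative Name'
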
	I1217 20:25:45.338800  414292 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:25:45.338978  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:45.339040  414292 machine.go:97] duration metric: took 1.053082119s to provisionDockerMachine
	I1217 20:25:45.339048  414292 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:25:45.339059  414292 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:25:45.339122  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:25:45.339165  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.356059  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.452345  414292 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:25:45.455946  414292 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1217 20:25:45.455965  414292 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1217 20:25:45.455970  414292 command_runner.go:130] > VERSION_ID="12"
	I1217 20:25:45.455975  414292 command_runner.go:130] > VERSION="12 (bookworm)"
	I1217 20:25:45.455980  414292 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1217 20:25:45.455983  414292 command_runner.go:130] > ID=debian
	I1217 20:25:45.455989  414292 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1217 20:25:45.455994  414292 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1217 20:25:45.456008  414292 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1217 20:25:45.456046  414292 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:25:45.456062  414292 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:25:45.456073  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:25:45.456130  414292 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:25:45.456208  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:25:45.456215  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /etc/ssl/certs/3694612.pem
	I1217 20:25:45.456308  414292 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:25:45.456313  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> /etc/test/nested/copy/369461/hosts
	I1217 20:25:45.456356  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:25:45.464083  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.481460  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:25:45.500420  414292 start.go:296] duration metric: took 161.357637ms for postStartSetup
	I1217 20:25:45.500542  414292 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:25:45.500615  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.517677  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.609195  414292 command_runner.go:130] > 18%
	I1217 20:25:45.609800  414292 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:25:45.614741  414292 command_runner.go:130] > 159G
	I1217 20:25:45.614774  414292 fix.go:56] duration metric: took 1.348835133s for fixHost
	I1217 20:25:45.614785  414292 start.go:83] releasing machines lock for "functional-682596", held for 1.348870218s
	I1217 20:25:45.614866  414292 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:25:45.631621  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:45.631685  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:45.631702  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:45.631735  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:45.631767  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:45.631798  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:45.631848  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:45.631888  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.631907  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.631926  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.631943  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:45.631995  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:45.649517  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:45.754346  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:45.772163  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:45.789636  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:45.795706  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:45.796203  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.803937  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:45.811516  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815311  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815389  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.815474  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:45.856132  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:45.856705  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:45.864064  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.871519  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:45.879293  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883196  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883238  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.883306  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:45.924322  414292 command_runner.go:130] > b5213941
	I1217 20:25:45.924802  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:45.932259  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.939603  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:45.947311  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.950955  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951320  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.951411  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:45.993968  414292 command_runner.go:130] > 51391683
	I1217 20:25:45.994167  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:25:46.002855  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:25:46.007551  414292 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
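
The `openssl x509 -hash` calls above compute the subject-name hash that OpenSSL uses to look certificates up in /etc/ssl/certs; minikube then symlinks `<hash>.0` to each PEM so the trust store picks it up without rebuilding the bundle. The same steps by hand, as a sketch (CERT path copied from the log):

    # Install a CA into the OpenSSL hash directory manually.
    CERT=/usr/share/ca-certificates/minikubeCA.pem   # path taken from the log above
    HASH=$(openssl x509 -hash -noout -in "$CERT")    # prints e.g. b5213941
    sudo ln -fs "$CERT" "/etc/ssl/certs/${HASH}.0"   # ".0": first cert with this hash
    # Prefer the distro helper when present, exactly as the log does:
    command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates
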
	I1217 20:25:46.011748  414292 ssh_runner.go:195] Run: cat /version.json
	I1217 20:25:46.011837  414292 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:25:46.016112  414292 command_runner.go:130] > {"iso_version": "v1.37.0-1765579389-22117", "kicbase_version": "v0.0.48-1765661130-22141", "minikube_version": "v1.37.0", "commit": "cbb33128a244032d08f8fc6e6c9f03b30f0da3e4"}
	I1217 20:25:46.018576  414292 ssh_runner.go:195] Run: systemctl --version
	I1217 20:25:46.126907  414292 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1217 20:25:46.127016  414292 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1217 20:25:46.127060  414292 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1217 20:25:46.127172  414292 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1217 20:25:46.131726  414292 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1217 20:25:46.131887  414292 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:25:46.131965  414292 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:25:46.140024  414292 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:25:46.140047  414292 start.go:496] detecting cgroup driver to use...
	I1217 20:25:46.140078  414292 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:25:46.140156  414292 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:25:46.155753  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:25:46.168916  414292 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:25:46.169009  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:25:46.184457  414292 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:25:46.197441  414292 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:25:46.302684  414292 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:25:46.421553  414292 docker.go:234] disabling docker service ...
	I1217 20:25:46.421621  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:25:46.436823  414292 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:25:46.449890  414292 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:25:46.565021  414292 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:25:46.678341  414292 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
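
The four systemctl calls above are the usual belt-and-braces way to retire a socket-activated daemon: stopping docker.service alone is not enough, because a connection to docker.socket would re-activate it. A condensed sketch of the same sequence:

    # Stop docker and make sure socket activation cannot bring it back.
    sudo systemctl stop -f docker.socket docker.service
    sudo systemctl disable docker.socket
    sudo systemctl mask docker.service               # mask: prevent any re-activation
    sudo systemctl is-active --quiet docker || echo "docker is inactive"
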
	I1217 20:25:46.693104  414292 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:25:46.705993  414292 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1217 20:25:46.707385  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:25:46.716410  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:25:46.724756  414292 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:25:46.724876  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:25:46.733647  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.742030  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:25:46.750673  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:25:46.759312  414292 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:25:46.768595  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:25:46.777345  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:25:46.786196  414292 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:25:46.795479  414292 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:25:46.802392  414292 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1217 20:25:46.803423  414292 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
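
Bridge-based pod networking needs both of the kernel settings touched above: the log reads net.bridge.bridge-nf-call-iptables (already 1) and force-enables IPv4 forwarding by writing into /proc. The equivalent explicit sysctl invocations, as a sketch:

    # Kernel settings CNI bridge networking relies on.
    # bridge-nf-call-iptables requires the br_netfilter module to be loaded.
    sudo sysctl -w net.bridge.bridge-nf-call-iptables=1
    sudo sysctl -w net.ipv4.ip_forward=1
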
	I1217 20:25:46.811004  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:46.926090  414292 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 20:25:47.068989  414292 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:25:47.069169  414292 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:25:47.073250  414292 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1217 20:25:47.073355  414292 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1217 20:25:47.073385  414292 command_runner.go:130] > Device: 0,72	Inode: 1618        Links: 1
	I1217 20:25:47.073441  414292 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.073470  414292 command_runner.go:130] > Access: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073512  414292 command_runner.go:130] > Modify: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073542  414292 command_runner.go:130] > Change: 2025-12-17 20:25:47.016473578 +0000
	I1217 20:25:47.073561  414292 command_runner.go:130] >  Birth: -
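
After restarting containerd, minikube waits up to 60s for the Unix socket, and the stat output above confirms it exists and is a socket (srw-rw----). A minimal polling loop with the same timeout, assuming the standard socket path:

    # Poll for up to 60 seconds, then give up with a clear error.
    for _ in $(seq 1 60); do
      [ -S /run/containerd/containerd.sock ] && break
      sleep 1
    done
    [ -S /run/containerd/containerd.sock ] || { echo "containerd socket never appeared" >&2; exit 1; }
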
	I1217 20:25:47.073923  414292 start.go:564] Will wait 60s for crictl version
	I1217 20:25:47.074046  414292 ssh_runner.go:195] Run: which crictl
	I1217 20:25:47.077775  414292 command_runner.go:130] > /usr/local/bin/crictl
	I1217 20:25:47.078218  414292 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:25:47.104139  414292 command_runner.go:130] > Version:  0.1.0
	I1217 20:25:47.104225  414292 command_runner.go:130] > RuntimeName:  containerd
	I1217 20:25:47.104269  414292 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1217 20:25:47.104295  414292 command_runner.go:130] > RuntimeApiVersion:  v1
	I1217 20:25:47.106475  414292 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:25:47.106628  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.130403  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1217 20:25:47.132698  414292 ssh_runner.go:195] Run: containerd --version
	I1217 20:25:47.152199  414292 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
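
crictl talks to whatever CRI endpoint /etc/crictl.yaml points at (written earlier in this log), and `crictl version` reports both the client API version (0.1.0 / v1) and the runtime build (containerd v2.2.0). Pointing crictl at the socket explicitly works the same way:

    # Query the CRI runtime identity over the configured containerd socket.
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock version
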
	I1217 20:25:47.159813  414292 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:25:47.162759  414292 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:25:47.179237  414292 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:25:47.183476  414292 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1217 20:25:47.183701  414292 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:25:47.183825  414292 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:25:47.183890  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.207538  414292 command_runner.go:130] > {
	I1217 20:25:47.207560  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.207564  414292 command_runner.go:130] >     {
	I1217 20:25:47.207574  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.207582  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207588  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.207591  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207595  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207607  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.207614  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207618  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.207625  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207630  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207636  414292 command_runner.go:130] >     },
	I1217 20:25:47.207639  414292 command_runner.go:130] >     {
	I1217 20:25:47.207647  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.207655  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207660  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.207664  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207668  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207678  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.207684  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207688  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.207692  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207696  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207698  414292 command_runner.go:130] >     },
	I1217 20:25:47.207702  414292 command_runner.go:130] >     {
	I1217 20:25:47.207709  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.207715  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207720  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.207735  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207747  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207756  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.207759  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207763  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.207766  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.207770  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207773  414292 command_runner.go:130] >     },
	I1217 20:25:47.207776  414292 command_runner.go:130] >     {
	I1217 20:25:47.207783  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.207787  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207791  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.207795  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207798  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207806  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.207809  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207813  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.207817  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207822  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207826  414292 command_runner.go:130] >       },
	I1217 20:25:47.207833  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207837  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207842  414292 command_runner.go:130] >     },
	I1217 20:25:47.207846  414292 command_runner.go:130] >     {
	I1217 20:25:47.207853  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.207859  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207865  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.207867  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207872  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207886  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.207890  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207894  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.207897  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207906  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.207915  414292 command_runner.go:130] >       },
	I1217 20:25:47.207928  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.207932  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.207934  414292 command_runner.go:130] >     },
	I1217 20:25:47.207938  414292 command_runner.go:130] >     {
	I1217 20:25:47.207947  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.207955  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.207961  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.207964  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207968  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.207976  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.207982  414292 command_runner.go:130] >       ],
	I1217 20:25:47.207986  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.207990  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.207997  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208001  414292 command_runner.go:130] >       },
	I1217 20:25:47.208020  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208028  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208032  414292 command_runner.go:130] >     },
	I1217 20:25:47.208035  414292 command_runner.go:130] >     {
	I1217 20:25:47.208042  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.208049  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208054  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.208058  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208062  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208069  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.208074  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208079  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.208082  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208088  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208091  414292 command_runner.go:130] >     },
	I1217 20:25:47.208097  414292 command_runner.go:130] >     {
	I1217 20:25:47.208104  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.208114  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208120  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.208123  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208128  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208142  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.208146  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208149  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.208153  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208157  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.208163  414292 command_runner.go:130] >       },
	I1217 20:25:47.208168  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208173  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.208177  414292 command_runner.go:130] >     },
	I1217 20:25:47.208183  414292 command_runner.go:130] >     {
	I1217 20:25:47.208189  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.208195  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.208200  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.208203  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208207  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.208215  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.208221  414292 command_runner.go:130] >       ],
	I1217 20:25:47.208225  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.208229  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.208233  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.208237  414292 command_runner.go:130] >       },
	I1217 20:25:47.208240  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.208245  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.208339  414292 command_runner.go:130] >     }
	I1217 20:25:47.208342  414292 command_runner.go:130] >   ]
	I1217 20:25:47.208344  414292 command_runner.go:130] > }
	I1217 20:25:47.208525  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.208539  414292 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:25:47.208601  414292 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:25:47.230634  414292 command_runner.go:130] > {
	I1217 20:25:47.230653  414292 command_runner.go:130] >   "images":  [
	I1217 20:25:47.230659  414292 command_runner.go:130] >     {
	I1217 20:25:47.230668  414292 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1217 20:25:47.230673  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230679  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1217 20:25:47.230683  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230687  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230696  414292 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1217 20:25:47.230703  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230721  414292 command_runner.go:130] >       "size":  "40636774",
	I1217 20:25:47.230725  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230729  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230735  414292 command_runner.go:130] >     },
	I1217 20:25:47.230741  414292 command_runner.go:130] >     {
	I1217 20:25:47.230756  414292 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1217 20:25:47.230764  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230769  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1217 20:25:47.230773  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230786  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230798  414292 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1217 20:25:47.230801  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230812  414292 command_runner.go:130] >       "size":  "8034419",
	I1217 20:25:47.230816  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.230819  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230823  414292 command_runner.go:130] >     },
	I1217 20:25:47.230826  414292 command_runner.go:130] >     {
	I1217 20:25:47.230833  414292 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1217 20:25:47.230839  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230844  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1217 20:25:47.230857  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230888  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230900  414292 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1217 20:25:47.230911  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230916  414292 command_runner.go:130] >       "size":  "21168808",
	I1217 20:25:47.230923  414292 command_runner.go:130] >       "username":  "nonroot",
	I1217 20:25:47.230927  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.230936  414292 command_runner.go:130] >     },
	I1217 20:25:47.230939  414292 command_runner.go:130] >     {
	I1217 20:25:47.230946  414292 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1217 20:25:47.230950  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.230954  414292 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1217 20:25:47.230960  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230964  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.230972  414292 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1217 20:25:47.230984  414292 command_runner.go:130] >       ],
	I1217 20:25:47.230988  414292 command_runner.go:130] >       "size":  "21749640",
	I1217 20:25:47.230991  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.230995  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.230998  414292 command_runner.go:130] >       },
	I1217 20:25:47.231003  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231009  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231012  414292 command_runner.go:130] >     },
	I1217 20:25:47.231018  414292 command_runner.go:130] >     {
	I1217 20:25:47.231024  414292 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1217 20:25:47.231037  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231042  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1217 20:25:47.231045  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231050  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231063  414292 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1217 20:25:47.231067  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231071  414292 command_runner.go:130] >       "size":  "24692223",
	I1217 20:25:47.231074  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231087  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231093  414292 command_runner.go:130] >       },
	I1217 20:25:47.231097  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231111  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231117  414292 command_runner.go:130] >     },
	I1217 20:25:47.231125  414292 command_runner.go:130] >     {
	I1217 20:25:47.231132  414292 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1217 20:25:47.231138  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231144  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1217 20:25:47.231151  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231155  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231164  414292 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1217 20:25:47.231168  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231172  414292 command_runner.go:130] >       "size":  "20672157",
	I1217 20:25:47.231178  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231194  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231200  414292 command_runner.go:130] >       },
	I1217 20:25:47.231204  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231208  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231211  414292 command_runner.go:130] >     },
	I1217 20:25:47.231214  414292 command_runner.go:130] >     {
	I1217 20:25:47.231223  414292 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1217 20:25:47.231238  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231246  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1217 20:25:47.231250  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231254  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231264  414292 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1217 20:25:47.231276  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231280  414292 command_runner.go:130] >       "size":  "22432301",
	I1217 20:25:47.231284  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231288  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231291  414292 command_runner.go:130] >     },
	I1217 20:25:47.231294  414292 command_runner.go:130] >     {
	I1217 20:25:47.231309  414292 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1217 20:25:47.231317  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231323  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1217 20:25:47.231333  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231337  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231347  414292 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1217 20:25:47.231359  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231363  414292 command_runner.go:130] >       "size":  "15405535",
	I1217 20:25:47.231366  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231370  414292 command_runner.go:130] >         "value":  "0"
	I1217 20:25:47.231373  414292 command_runner.go:130] >       },
	I1217 20:25:47.231379  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231392  414292 command_runner.go:130] >       "pinned":  false
	I1217 20:25:47.231395  414292 command_runner.go:130] >     },
	I1217 20:25:47.231405  414292 command_runner.go:130] >     {
	I1217 20:25:47.231412  414292 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1217 20:25:47.231418  414292 command_runner.go:130] >       "repoTags":  [
	I1217 20:25:47.231423  414292 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1217 20:25:47.231428  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231437  414292 command_runner.go:130] >       "repoDigests":  [
	I1217 20:25:47.231445  414292 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1217 20:25:47.231448  414292 command_runner.go:130] >       ],
	I1217 20:25:47.231452  414292 command_runner.go:130] >       "size":  "267939",
	I1217 20:25:47.231455  414292 command_runner.go:130] >       "uid":  {
	I1217 20:25:47.231459  414292 command_runner.go:130] >         "value":  "65535"
	I1217 20:25:47.231462  414292 command_runner.go:130] >       },
	I1217 20:25:47.231466  414292 command_runner.go:130] >       "username":  "",
	I1217 20:25:47.231469  414292 command_runner.go:130] >       "pinned":  true
	I1217 20:25:47.231473  414292 command_runner.go:130] >     }
	I1217 20:25:47.231479  414292 command_runner.go:130] >   ]
	I1217 20:25:47.231482  414292 command_runner.go:130] > }
	I1217 20:25:47.233897  414292 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:25:47.233919  414292 cache_images.go:86] Images are preloaded, skipping loading
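
Both `crictl images --output json` dumps above list the same nine images for v1.35.0-rc.1, which is how minikube decides the preload tarball can be skipped. Summarizing such a dump from the shell, assuming jq is installed:

    # List image tags and sizes from the CRI image store; requires jq.
    sudo crictl images --output json \
      | jq -r '.images[] | "\(.repoTags[0])\t\(.size)"'
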
	I1217 20:25:47.233928  414292 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:25:47.234041  414292 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
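
The empty `ExecStart=` line before the real one in the kubelet unit above is the standard systemd drop-in idiom: an override must first clear a unit's existing command, otherwise systemd appends rather than replaces. A sketch of writing such a drop-in by hand (the filename 10-override.conf and the trimmed flag set are illustrative placeholders, not what minikube writes):

    # Override kubelet's ExecStart via a systemd drop-in, then reload.
    sudo mkdir -p /etc/systemd/system/kubelet.service.d
    sudo tee /etc/systemd/system/kubelet.service.d/10-override.conf >/dev/null <<'EOF'
    [Service]
    ExecStart=
    ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --config=/var/lib/kubelet/config.yaml
    EOF
    sudo systemctl daemon-reload
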
	I1217 20:25:47.234107  414292 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:25:47.256786  414292 command_runner.go:130] > {
	I1217 20:25:47.256808  414292 command_runner.go:130] >   "cniconfig": {
	I1217 20:25:47.256814  414292 command_runner.go:130] >     "Networks": [
	I1217 20:25:47.256818  414292 command_runner.go:130] >       {
	I1217 20:25:47.256823  414292 command_runner.go:130] >         "Config": {
	I1217 20:25:47.256827  414292 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1217 20:25:47.256833  414292 command_runner.go:130] >           "Name": "cni-loopback",
	I1217 20:25:47.256837  414292 command_runner.go:130] >           "Plugins": [
	I1217 20:25:47.256840  414292 command_runner.go:130] >             {
	I1217 20:25:47.256846  414292 command_runner.go:130] >               "Network": {
	I1217 20:25:47.256851  414292 command_runner.go:130] >                 "ipam": {},
	I1217 20:25:47.256863  414292 command_runner.go:130] >                 "type": "loopback"
	I1217 20:25:47.256875  414292 command_runner.go:130] >               },
	I1217 20:25:47.256880  414292 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1217 20:25:47.256883  414292 command_runner.go:130] >             }
	I1217 20:25:47.256887  414292 command_runner.go:130] >           ],
	I1217 20:25:47.256896  414292 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1217 20:25:47.256900  414292 command_runner.go:130] >         },
	I1217 20:25:47.256911  414292 command_runner.go:130] >         "IFName": "lo"
	I1217 20:25:47.256917  414292 command_runner.go:130] >       }
	I1217 20:25:47.256920  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256924  414292 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1217 20:25:47.256927  414292 command_runner.go:130] >     "PluginDirs": [
	I1217 20:25:47.256932  414292 command_runner.go:130] >       "/opt/cni/bin"
	I1217 20:25:47.256941  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256945  414292 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1217 20:25:47.256949  414292 command_runner.go:130] >     "Prefix": "eth"
	I1217 20:25:47.256952  414292 command_runner.go:130] >   },
	I1217 20:25:47.256957  414292 command_runner.go:130] >   "config": {
	I1217 20:25:47.256962  414292 command_runner.go:130] >     "cdiSpecDirs": [
	I1217 20:25:47.256965  414292 command_runner.go:130] >       "/etc/cdi",
	I1217 20:25:47.256969  414292 command_runner.go:130] >       "/var/run/cdi"
	I1217 20:25:47.256977  414292 command_runner.go:130] >     ],
	I1217 20:25:47.256985  414292 command_runner.go:130] >     "cni": {
	I1217 20:25:47.256991  414292 command_runner.go:130] >       "binDir": "",
	I1217 20:25:47.256995  414292 command_runner.go:130] >       "binDirs": [
	I1217 20:25:47.256999  414292 command_runner.go:130] >         "/opt/cni/bin"
	I1217 20:25:47.257003  414292 command_runner.go:130] >       ],
	I1217 20:25:47.257008  414292 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1217 20:25:47.257025  414292 command_runner.go:130] >       "confTemplate": "",
	I1217 20:25:47.257029  414292 command_runner.go:130] >       "ipPref": "",
	I1217 20:25:47.257033  414292 command_runner.go:130] >       "maxConfNum": 1,
	I1217 20:25:47.257040  414292 command_runner.go:130] >       "setupSerially": false,
	I1217 20:25:47.257044  414292 command_runner.go:130] >       "useInternalLoopback": false
	I1217 20:25:47.257049  414292 command_runner.go:130] >     },
	I1217 20:25:47.257057  414292 command_runner.go:130] >     "containerd": {
	I1217 20:25:47.257061  414292 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1217 20:25:47.257069  414292 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1217 20:25:47.257076  414292 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1217 20:25:47.257080  414292 command_runner.go:130] >       "runtimes": {
	I1217 20:25:47.257084  414292 command_runner.go:130] >         "runc": {
	I1217 20:25:47.257097  414292 command_runner.go:130] >           "ContainerAnnotations": null,
	I1217 20:25:47.257102  414292 command_runner.go:130] >           "PodAnnotations": null,
	I1217 20:25:47.257106  414292 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1217 20:25:47.257111  414292 command_runner.go:130] >           "cgroupWritable": false,
	I1217 20:25:47.257119  414292 command_runner.go:130] >           "cniConfDir": "",
	I1217 20:25:47.257123  414292 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1217 20:25:47.257127  414292 command_runner.go:130] >           "io_type": "",
	I1217 20:25:47.257133  414292 command_runner.go:130] >           "options": {
	I1217 20:25:47.257139  414292 command_runner.go:130] >             "BinaryName": "",
	I1217 20:25:47.257143  414292 command_runner.go:130] >             "CriuImagePath": "",
	I1217 20:25:47.257148  414292 command_runner.go:130] >             "CriuWorkPath": "",
	I1217 20:25:47.257154  414292 command_runner.go:130] >             "IoGid": 0,
	I1217 20:25:47.257158  414292 command_runner.go:130] >             "IoUid": 0,
	I1217 20:25:47.257162  414292 command_runner.go:130] >             "NoNewKeyring": false,
	I1217 20:25:47.257174  414292 command_runner.go:130] >             "Root": "",
	I1217 20:25:47.257186  414292 command_runner.go:130] >             "ShimCgroup": "",
	I1217 20:25:47.257193  414292 command_runner.go:130] >             "SystemdCgroup": false
	I1217 20:25:47.257196  414292 command_runner.go:130] >           },
	I1217 20:25:47.257206  414292 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1217 20:25:47.257213  414292 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1217 20:25:47.257217  414292 command_runner.go:130] >           "runtimePath": "",
	I1217 20:25:47.257224  414292 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1217 20:25:47.257229  414292 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1217 20:25:47.257233  414292 command_runner.go:130] >           "snapshotter": ""
	I1217 20:25:47.257238  414292 command_runner.go:130] >         }
	I1217 20:25:47.257241  414292 command_runner.go:130] >       }
	I1217 20:25:47.257246  414292 command_runner.go:130] >     },
	I1217 20:25:47.257261  414292 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1217 20:25:47.257269  414292 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1217 20:25:47.257274  414292 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1217 20:25:47.257280  414292 command_runner.go:130] >     "disableApparmor": false,
	I1217 20:25:47.257290  414292 command_runner.go:130] >     "disableHugetlbController": true,
	I1217 20:25:47.257294  414292 command_runner.go:130] >     "disableProcMount": false,
	I1217 20:25:47.257299  414292 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1217 20:25:47.257303  414292 command_runner.go:130] >     "enableCDI": true,
	I1217 20:25:47.257309  414292 command_runner.go:130] >     "enableSelinux": false,
	I1217 20:25:47.257313  414292 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1217 20:25:47.257318  414292 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1217 20:25:47.257325  414292 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1217 20:25:47.257331  414292 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1217 20:25:47.257336  414292 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1217 20:25:47.257340  414292 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1217 20:25:47.257353  414292 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1217 20:25:47.257358  414292 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257362  414292 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1217 20:25:47.257368  414292 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1217 20:25:47.257375  414292 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1217 20:25:47.257379  414292 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1217 20:25:47.257386  414292 command_runner.go:130] >   },
	I1217 20:25:47.257390  414292 command_runner.go:130] >   "features": {
	I1217 20:25:47.257396  414292 command_runner.go:130] >     "supplemental_groups_policy": true
	I1217 20:25:47.257399  414292 command_runner.go:130] >   },
	I1217 20:25:47.257403  414292 command_runner.go:130] >   "golang": "go1.24.9",
	I1217 20:25:47.257416  414292 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257429  414292 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1217 20:25:47.257433  414292 command_runner.go:130] >   "runtimeHandlers": [
	I1217 20:25:47.257436  414292 command_runner.go:130] >     {
	I1217 20:25:47.257447  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257451  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257455  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257460  414292 command_runner.go:130] >       }
	I1217 20:25:47.257463  414292 command_runner.go:130] >     },
	I1217 20:25:47.257469  414292 command_runner.go:130] >     {
	I1217 20:25:47.257473  414292 command_runner.go:130] >       "features": {
	I1217 20:25:47.257477  414292 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1217 20:25:47.257481  414292 command_runner.go:130] >         "user_namespaces": true
	I1217 20:25:47.257484  414292 command_runner.go:130] >       },
	I1217 20:25:47.257488  414292 command_runner.go:130] >       "name": "runc"
	I1217 20:25:47.257494  414292 command_runner.go:130] >     }
	I1217 20:25:47.257497  414292 command_runner.go:130] >   ],
	I1217 20:25:47.257502  414292 command_runner.go:130] >   "status": {
	I1217 20:25:47.257506  414292 command_runner.go:130] >     "conditions": [
	I1217 20:25:47.257509  414292 command_runner.go:130] >       {
	I1217 20:25:47.257514  414292 command_runner.go:130] >         "message": "",
	I1217 20:25:47.257526  414292 command_runner.go:130] >         "reason": "",
	I1217 20:25:47.257530  414292 command_runner.go:130] >         "status": true,
	I1217 20:25:47.257536  414292 command_runner.go:130] >         "type": "RuntimeReady"
	I1217 20:25:47.257539  414292 command_runner.go:130] >       },
	I1217 20:25:47.257543  414292 command_runner.go:130] >       {
	I1217 20:25:47.257549  414292 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1217 20:25:47.257554  414292 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1217 20:25:47.257563  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257568  414292 command_runner.go:130] >         "type": "NetworkReady"
	I1217 20:25:47.257574  414292 command_runner.go:130] >       },
	I1217 20:25:47.257577  414292 command_runner.go:130] >       {
	I1217 20:25:47.257599  414292 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1217 20:25:47.257609  414292 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1217 20:25:47.257615  414292 command_runner.go:130] >         "status": false,
	I1217 20:25:47.257620  414292 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1217 20:25:47.257626  414292 command_runner.go:130] >       }
	I1217 20:25:47.257629  414292 command_runner.go:130] >     ]
	I1217 20:25:47.257631  414292 command_runner.go:130] >   }
	I1217 20:25:47.257634  414292 command_runner.go:130] > }
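The JSON above is the `crictl info` snapshot minikube reads before choosing a CNI: RuntimeReady is true, while NetworkReady is false because /etc/cni/net.d is still empty at this point in the start sequence. A minimal stdlib sketch of decoding those conditions (a hypothetical helper, not minikube's actual parser):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// criCondition mirrors one entry of status.conditions in the `crictl info`
// JSON dumped above (in this containerd version, status is a plain boolean).
type criCondition struct {
	Type    string `json:"type"`
	Status  bool   `json:"status"`
	Reason  string `json:"reason"`
	Message string `json:"message"`
}

func main() {
	// Query the runtime the same way the log's command_runner does.
	out, err := exec.Command("sudo", "crictl", "info").Output()
	if err != nil {
		panic(err)
	}
	var info struct {
		Status struct {
			Conditions []criCondition `json:"conditions"`
		} `json:"status"`
	}
	if err := json.Unmarshal(out, &info); err != nil {
		panic(err)
	}
	for _, c := range info.Status.Conditions {
		// Here NetworkReady=false (NetworkPluginNotReady) is expected:
		// no CNI config has been applied yet.
		fmt.Printf("%-36s ok=%-5v reason=%s\n", c.Type, c.Status, c.Reason)
	}
}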
	I1217 20:25:47.259959  414292 cni.go:84] Creating CNI manager for ""
	I1217 20:25:47.259981  414292 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:25:47.259991  414292 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
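The kindnet recommendation above follows from the driver/runtime pair alone. A toy version of that decision (illustrative only; minikube's real cni.go logic also weighs things like an explicit --cni flag):

package main

import "fmt"

// chooseCNI sketches the selection logged above: the docker driver with a
// non-docker runtime (here containerd) gets kindnet. The "bridge" fallback
// is an assumption for illustration, not minikube's full decision table.
func chooseCNI(driver, runtime string) string {
	if driver == "docker" && runtime != "docker" {
		return "kindnet"
	}
	return "bridge"
}

func main() {
	fmt.Println(chooseCNI("docker", "containerd")) // kindnet
}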
	I1217 20:25:47.260020  414292 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:25:47.260142  414292 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
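Everything between `kubeadm config:` and the blank line above is rendered from the options struct at kubeadm.go:190. A rough sketch of that render step using text/template (the struct fields and template here are invented for illustration, with values taken from the log):

package main

import (
	"os"
	"text/template"
)

// kubeadmParams is a pared-down stand-in for minikube's kubeadm options
// struct; only a few of the fields visible in the log are modeled.
type kubeadmParams struct {
	AdvertiseAddress string
	APIServerPort    int
	NodeName         string
	CRISocket        string
}

const initCfg = `apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
localAPIEndpoint:
  advertiseAddress: {{.AdvertiseAddress}}
  bindPort: {{.APIServerPort}}
nodeRegistration:
  criSocket: unix://{{.CRISocket}}
  name: "{{.NodeName}}"
`

func main() {
	t := template.Must(template.New("init").Parse(initCfg))
	_ = t.Execute(os.Stdout, kubeadmParams{
		AdvertiseAddress: "192.168.49.2",
		APIServerPort:    8441,
		NodeName:         "functional-682596",
		CRISocket:        "/run/containerd/containerd.sock",
	})
}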
	I1217 20:25:47.260216  414292 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:25:47.267498  414292 command_runner.go:130] > kubeadm
	I1217 20:25:47.267517  414292 command_runner.go:130] > kubectl
	I1217 20:25:47.267520  414292 command_runner.go:130] > kubelet
	I1217 20:25:47.268462  414292 binaries.go:51] Found k8s binaries, skipping transfer
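The `sudo ls` above is the entire transfer-skip check: if kubeadm, kubectl and kubelet are already in the versioned binaries directory, nothing gets copied. Sketched:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// haveBinaries reports whether all three Kubernetes binaries already exist
// under dir, in which case the scp transfer can be skipped (the happy path
// logged at binaries.go:51). Illustrative helper, not minikube's code.
func haveBinaries(dir string) bool {
	for _, name := range []string{"kubeadm", "kubectl", "kubelet"} {
		if _, err := os.Stat(filepath.Join(dir, name)); err != nil {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(haveBinaries("/var/lib/minikube/binaries/v1.35.0-rc.1"))
}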
	I1217 20:25:47.268563  414292 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:25:47.276438  414292 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:25:47.289778  414292 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:25:47.303155  414292 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1217 20:25:47.315864  414292 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:25:47.319319  414292 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1217 20:25:47.319605  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:47.441462  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:47.463080  414292 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:25:47.463150  414292 certs.go:195] generating shared ca certs ...
	I1217 20:25:47.463190  414292 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:47.463362  414292 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:25:47.463461  414292 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:25:47.463501  414292 certs.go:257] generating profile certs ...
	I1217 20:25:47.463662  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:25:47.463774  414292 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:25:47.463860  414292 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:25:47.463894  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1217 20:25:47.463938  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1217 20:25:47.463977  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1217 20:25:47.464005  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1217 20:25:47.464049  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1217 20:25:47.464079  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1217 20:25:47.464117  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1217 20:25:47.464151  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1217 20:25:47.464241  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:25:47.464342  414292 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:25:47.464377  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:25:47.464421  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:25:47.464488  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:25:47.464541  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:25:47.464629  414292 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:25:47.464693  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.464733  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.464771  414292 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem -> /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.469220  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:25:47.495389  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:25:47.516308  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:25:47.535144  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:25:47.552466  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:25:47.570909  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:25:47.588173  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:25:47.606011  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:25:47.623433  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:25:47.640520  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:25:47.657751  414292 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:25:47.675695  414292 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:25:47.688487  414292 ssh_runner.go:195] Run: openssl version
	I1217 20:25:47.694560  414292 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1217 20:25:47.694946  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.702368  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:25:47.710124  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713826  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713858  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.713917  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:25:47.754917  414292 command_runner.go:130] > 3ec20f2e
	I1217 20:25:47.755445  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:25:47.763008  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.770327  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:25:47.778030  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782014  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782042  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.782099  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:25:47.822920  414292 command_runner.go:130] > b5213941
	I1217 20:25:47.823058  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:25:47.830582  414292 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.837906  414292 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:25:47.845640  414292 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849463  414292 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849531  414292 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.849600  414292 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:25:47.890040  414292 command_runner.go:130] > 51391683
	I1217 20:25:47.890555  414292 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
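Each CA install above follows the same three-step pattern: copy the PEM, ask openssl for its subject hash, and symlink /etc/ssl/certs/<hash>.0 at it so OpenSSL's lookup-by-hash can find the cert. A compact sketch of steps two and three (needs root to write the link):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

// installCertLink reproduces the `openssl x509 -hash` + `ln -fs` pair from
// the log: hash the certificate's subject and point the hash-named symlink
// in /etc/ssl/certs at the PEM. Minimal sketch with minimal error handling.
func installCertLink(pemPath string) error {
	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pemPath).Output()
	if err != nil {
		return err
	}
	link := filepath.Join("/etc/ssl/certs", strings.TrimSpace(string(out))+".0")
	os.Remove(link) // -f semantics: replace any stale link
	return os.Symlink(pemPath, link)
}

func main() {
	if err := installCertLink("/usr/share/ca-certificates/minikubeCA.pem"); err != nil {
		fmt.Println("install failed:", err)
	}
}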
	I1217 20:25:47.898150  414292 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901790  414292 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:25:47.901872  414292 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1217 20:25:47.901887  414292 command_runner.go:130] > Device: 259,1	Inode: 1060771     Links: 1
	I1217 20:25:47.901895  414292 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1217 20:25:47.901902  414292 command_runner.go:130] > Access: 2025-12-17 20:21:41.033930957 +0000
	I1217 20:25:47.901907  414292 command_runner.go:130] > Modify: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901912  414292 command_runner.go:130] > Change: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901921  414292 command_runner.go:130] >  Birth: 2025-12-17 20:17:35.731490416 +0000
	I1217 20:25:47.901988  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:25:47.942293  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.942780  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:25:47.983019  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:47.983513  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:25:48.024341  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.024837  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:25:48.065771  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.066190  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:25:48.107223  414292 command_runner.go:130] > Certificate will not expire
	I1217 20:25:48.107692  414292 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:25:48.148374  414292 command_runner.go:130] > Certificate will not expire
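`openssl x509 -checkend 86400` exits zero when the certificate is still valid 24 hours from now, which is why each probe above logs "Certificate will not expire". The same check in pure Go, without shelling out:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the PEM certificate at path expires within d,
// mirroring the -checkend probes above. Sketch with minimal error handling.
func expiresWithin(path string, d time.Duration) (bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return false, fmt.Errorf("no PEM block in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/front-proxy-client.crt", 24*time.Hour)
	fmt.Println("expires within 24h:", soon, err)
}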
	I1217 20:25:48.148810  414292 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:25:48.148912  414292 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:25:48.148983  414292 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:25:48.175983  414292 cri.go:89] found id: ""
	I1217 20:25:48.176056  414292 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:25:48.182939  414292 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1217 20:25:48.182960  414292 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1217 20:25:48.182967  414292 command_runner.go:130] > /var/lib/minikube/etcd:
	I1217 20:25:48.183854  414292 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:25:48.183910  414292 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:25:48.183977  414292 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:25:48.191197  414292 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:25:48.191635  414292 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-682596" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.191740  414292 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "functional-682596" cluster setting kubeconfig missing "functional-682596" context setting]
	I1217 20:25:48.192034  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
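The repair logged at kubeconfig.go:62 amounts to adding the missing cluster and context entries for the profile and writing the file back under a lock. A sketch of that fix-up with client-go's clientcmd package (values from the log; this is an illustration, not minikube's implementation):

package main

import (
	"k8s.io/client-go/tools/clientcmd"
	clientcmdapi "k8s.io/client-go/tools/clientcmd/api"
)

// repairKubeconfig adds the profile's cluster and context entries if they
// are missing, then writes the kubeconfig back.
func repairKubeconfig(path, name, server string) error {
	cfg, err := clientcmd.LoadFromFile(path)
	if err != nil {
		return err
	}
	if _, ok := cfg.Clusters[name]; !ok {
		cfg.Clusters[name] = &clientcmdapi.Cluster{Server: server}
	}
	if _, ok := cfg.Contexts[name]; !ok {
		cfg.Contexts[name] = &clientcmdapi.Context{Cluster: name, AuthInfo: name}
	}
	return clientcmd.WriteToFile(*cfg, path)
}

func main() {
	_ = repairKubeconfig(
		"/home/jenkins/minikube-integration/21808-367595/kubeconfig",
		"functional-682596",
		"https://192.168.49.2:8441",
	)
}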
	I1217 20:25:48.192565  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.192744  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.193250  414292 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 20:25:48.193273  414292 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 20:25:48.193281  414292 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 20:25:48.193286  414292 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 20:25:48.193293  414292 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1217 20:25:48.193576  414292 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:25:48.193650  414292 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1217 20:25:48.201269  414292 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1217 20:25:48.201338  414292 kubeadm.go:602] duration metric: took 17.417602ms to restartPrimaryControlPlane
	I1217 20:25:48.201355  414292 kubeadm.go:403] duration metric: took 52.552362ms to StartCluster
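The "does not require reconfiguration" verdict at kubeadm.go:635 comes straight from the `diff -u` run a few lines above: if the deployed kubeadm.yaml matches the freshly rendered kubeadm.yaml.new, the running control plane already reflects the desired config. As a sketch:

package main

import (
	"fmt"
	"os/exec"
)

// needsReconfig runs the same comparison as the log above: diff exits 0 when
// the two files match, non-zero when the config drifted. Sketch only; a real
// check would distinguish diff's exit status 1 (differ) from 2 (trouble).
func needsReconfig() bool {
	err := exec.Command("sudo", "diff", "-u",
		"/var/tmp/minikube/kubeadm.yaml",
		"/var/tmp/minikube/kubeadm.yaml.new").Run()
	return err != nil
}

func main() {
	fmt.Println("needs reconfig:", needsReconfig())
}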
	I1217 20:25:48.201370  414292 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.201429  414292 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.202007  414292 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:25:48.202208  414292 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 20:25:48.202539  414292 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:25:48.202581  414292 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 20:25:48.202699  414292 addons.go:70] Setting storage-provisioner=true in profile "functional-682596"
	I1217 20:25:48.202717  414292 addons.go:239] Setting addon storage-provisioner=true in "functional-682596"
	I1217 20:25:48.202742  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.202770  414292 addons.go:70] Setting default-storageclass=true in profile "functional-682596"
	I1217 20:25:48.202806  414292 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-682596"
	I1217 20:25:48.203165  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.203224  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.208687  414292 out.go:179] * Verifying Kubernetes components...
	I1217 20:25:48.211692  414292 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:25:48.230383  414292 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 20:25:48.233339  414292 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.233361  414292 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 20:25:48.233423  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.236813  414292 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:25:48.236975  414292 kapi.go:59] client config for functional-682596: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 20:25:48.237238  414292 addons.go:239] Setting addon default-storageclass=true in "functional-682596"
	I1217 20:25:48.237267  414292 host.go:66] Checking if "functional-682596" exists ...
	I1217 20:25:48.237711  414292 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:25:48.262897  414292 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:48.262919  414292 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 20:25:48.262996  414292 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:25:48.269972  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.294767  414292 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:25:48.418586  414292 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:25:48.450623  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:48.465245  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.239916  414292 node_ready.go:35] waiting up to 6m0s for node "functional-682596" to be "Ready" ...
	I1217 20:25:49.240030  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.240095  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.240342  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240376  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240403  414292 retry.go:31] will retry after 252.350229ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240440  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.240459  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.240479  414292 retry.go:31] will retry after 321.821783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
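Both addon manifests fail to apply for the same reason: kubectl's validation needs the apiserver on :8441, which is still restarting, so every apply dies with "connection refused" and retry.go schedules another attempt with a growing, jittered delay (252ms, 321ms, 328ms, ... in the lines around here). The shape of that helper, sketched (the real retry.go also honors caps and cancellation):

package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// retryWithBackoff re-runs fn until it succeeds or attempts are exhausted,
// sleeping a jittered, roughly doubling delay between tries -- the pattern
// behind the "will retry after ..." lines in this log.
func retryWithBackoff(attempts int, base time.Duration, fn func() error) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		delay := base<<uint(i) + time.Duration(rand.Int63n(int64(base)))
		fmt.Printf("will retry after %v: %v\n", delay, err)
		time.Sleep(delay)
	}
	return err
}

func main() {
	_ = retryWithBackoff(4, 250*time.Millisecond, func() error {
		return errors.New("dial tcp [::1]:8441: connect: connection refused")
	})
}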
	I1217 20:25:49.240555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.493033  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.547929  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.551638  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.551667  414292 retry.go:31] will retry after 328.531722ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.562869  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:49.621023  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.625124  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.625209  414292 retry.go:31] will retry after 442.103425ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.740481  414292 type.go:168] "Request Body" body=""
	I1217 20:25:49.740559  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:49.740872  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:49.881274  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:49.942102  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:49.945784  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:49.945890  414292 retry.go:31] will retry after 409.243705ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.068055  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.127397  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.131721  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.131759  414292 retry.go:31] will retry after 566.560423ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.241000  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.241406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.355732  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:50.414970  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.419857  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.419893  414292 retry.go:31] will retry after 763.212709ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.699479  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:50.741041  414292 type.go:168] "Request Body" body=""
	I1217 20:25:50.741134  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:50.741465  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:50.776772  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:50.776815  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:50.776839  414292 retry.go:31] will retry after 1.24877806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.183473  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:51.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.240277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.240545  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:51.240594  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
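The node_ready.go loop behind these round-tripper entries GETs /api/v1/nodes/functional-682596 roughly every 500ms for up to 6 minutes, tolerating connection-refused errors while the apiserver comes back. Its shape, as a generic poller (illustrative only; the timeout here is shortened so the sketch terminates quickly):

package main

import (
	"errors"
	"fmt"
	"time"
)

// waitReady polls check on a fixed cadence until it reports true or the
// timeout lapses, logging transient errors instead of failing on them --
// the behavior visible in the node_ready.go lines above.
func waitReady(timeout, interval time.Duration, check func() (bool, error)) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		ready, err := check()
		if err != nil {
			fmt.Println("will retry:", err) // e.g. connect: connection refused
		} else if ready {
			return nil
		}
		time.Sleep(interval)
	}
	return errors.New("node did not become Ready in time")
}

func main() {
	_ = waitReady(2*time.Second, 500*time.Millisecond, func() (bool, error) {
		// Stand-in for the GET against https://192.168.49.2:8441.
		return false, errors.New("dial tcp 192.168.49.2:8441: connect: connection refused")
	})
}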
	I1217 20:25:51.251909  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:51.255943  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.255983  414292 retry.go:31] will retry after 1.271740821s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:51.740532  414292 type.go:168] "Request Body" body=""
	I1217 20:25:51.740649  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:51.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.026483  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:52.095052  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.095119  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.095140  414292 retry.go:31] will retry after 1.58694383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.240430  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.240682  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:52.528382  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:52.586445  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:52.590032  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.590066  414292 retry.go:31] will retry after 1.445188932s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:52.740386  414292 type.go:168] "Request Body" body=""
	I1217 20:25:52.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:52.740818  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:53.240660  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:53.682297  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:53.740043  414292 type.go:168] "Request Body" body=""
	I1217 20:25:53.740108  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:53.740352  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:53.743851  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:53.743882  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:53.743900  414292 retry.go:31] will retry after 2.69671946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.036496  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:54.096053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:54.096099  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.096122  414292 retry.go:31] will retry after 2.925706415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:54.240487  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.240571  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.240903  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:54.740656  414292 type.go:168] "Request Body" body=""
	I1217 20:25:54.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:54.741104  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:55.240849  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.240918  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.241169  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:55.241222  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:25:55.741059  414292 type.go:168] "Request Body" body=""
	I1217 20:25:55.741137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:55.741444  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.240196  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.240645  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:56.440979  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:56.500702  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:56.500749  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:56.500767  414292 retry.go:31] will retry after 1.84810195s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
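Every variant of this error is the same symptom. kubectl validates manifests by fetching the schema from /openapi/v2 on the apiserver, and nothing is listening on localhost:8441, so the fetch gets connection refused. The suggested --validate=false would only skip the schema download; the apply itself would still fail to reach the server, because kube-apiserver is down, not merely slow. A quick reachability probe against the same endpoint (a hypothetical helper, not part of the test suite):

	package main

	import (
		"fmt"
		"net"
		"time"
	)

	func main() {
		// Dial the apiserver's TCP endpoint directly; "connection refused"
		// means the port is closed, i.e. kube-apiserver is not running.
		conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
		if err != nil {
			fmt.Println("apiserver unreachable:", err)
			return
		}
		conn.Close()
		fmt.Println("apiserver port is open")
	}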
	I1217 20:25:56.740117  414292 type.go:168] "Request Body" body=""
	I1217 20:25:56.740201  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:56.740503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.023057  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:25:57.082954  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:57.083001  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.083020  414292 retry.go:31] will retry after 3.223759279s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:57.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.240558  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:57.740268  414292 type.go:168] "Request Body" body=""
	I1217 20:25:57.740347  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:57.740685  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:25:57.740756  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
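Interleaved with the apply retries, node_ready.go polls GET /api/v1/nodes/functional-682596 on a 500ms tick, surfacing a warning every few failed polls and continuing for as long as the dial fails. A sketch of that wait loop using client-go — the interval and swallow-errors behavior are inferred from the timestamps above, not copied from minikube:

	package main

	import (
		"context"
		"fmt"
		"time"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/apimachinery/pkg/util/wait"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	// waitNodeReady polls the node's Ready condition every 500ms, treating
	// transient errors (like the connection-refused GETs above) as "not
	// ready yet" until the timeout expires.
	func waitNodeReady(cs kubernetes.Interface, name string, timeout time.Duration) error {
		return wait.PollImmediate(500*time.Millisecond, timeout, func() (bool, error) {
			node, err := cs.CoreV1().Nodes().Get(context.TODO(), name, metav1.GetOptions{})
			if err != nil {
				return false, nil // keep retrying on any error
			}
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady {
					return c.Status == corev1.ConditionTrue, nil
				}
			}
			return false, nil
		})
	}

	func main() {
		config, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
		if err != nil {
			panic(err)
		}
		cs := kubernetes.NewForConfigOrDie(config)
		fmt.Println(waitNodeReady(cs, "functional-682596", 4*time.Minute))
	}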
	I1217 20:25:58.240571  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.240660  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.240952  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:58.349268  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:25:58.403710  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:25:58.407286  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.407317  414292 retry.go:31] will retry after 3.305771044s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:25:58.740858  414292 type.go:168] "Request Body" body=""
	I1217 20:25:58.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:58.741275  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.240560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:25:59.740145  414292 type.go:168] "Request Body" body=""
	I1217 20:25:59.740223  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:25:59.740492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:00.240307  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.240806  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:00.240857  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:00.307216  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:00.372358  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:00.376526  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.376564  414292 retry.go:31] will retry after 8.003704403s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:00.740135  414292 type.go:168] "Request Body" body=""
	I1217 20:26:00.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:00.740543  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.240281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.240535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.713237  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:01.740945  414292 type.go:168] "Request Body" body=""
	I1217 20:26:01.741019  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:01.741278  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:01.769053  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:01.772711  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:01.772742  414292 retry.go:31] will retry after 3.267552643s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:02.240198  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.240302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.240604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:02.740266  414292 type.go:168] "Request Body" body=""
	I1217 20:26:02.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:02.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:02.740769  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:03.240210  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:03.740228  414292 type.go:168] "Request Body" body=""
	I1217 20:26:03.740320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:03.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.240516  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.240943  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:04.740734  414292 type.go:168] "Request Body" body=""
	I1217 20:26:04.740811  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:04.741190  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:04.741246  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:05.040756  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:05.102503  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:05.102552  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.102572  414292 retry.go:31] will retry after 12.344413157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:05.240841  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.240913  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.241244  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:05.740855  414292 type.go:168] "Request Body" body=""
	I1217 20:26:05.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:05.741188  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.241036  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.241119  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.241411  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:06.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:26:06.740212  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:06.740571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:07.240279  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.240353  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.240608  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:07.240657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:07.740195  414292 type.go:168] "Request Body" body=""
	I1217 20:26:07.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:07.740591  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.240525  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.240599  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.240914  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:08.381383  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:08.435212  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:08.439369  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.439410  414292 retry.go:31] will retry after 8.892819822s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:08.740968  414292 type.go:168] "Request Body" body=""
	I1217 20:26:08.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:08.741390  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.240230  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.240616  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:09.740331  414292 type.go:168] "Request Body" body=""
	I1217 20:26:09.740408  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:09.740742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:09.740801  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:10.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.240780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:10.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:10.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:10.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.240308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.240651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:11.740164  414292 type.go:168] "Request Body" body=""
	I1217 20:26:11.740235  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:11.740510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:12.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:12.240683  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:12.740202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:12.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:12.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.240171  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.240266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.240526  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:13.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:13.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:13.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:14.240635  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.240715  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.241059  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:14.241125  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:14.741075  414292 type.go:168] "Request Body" body=""
	I1217 20:26:14.741149  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:14.741406  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.240079  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.240494  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:15.740230  414292 type.go:168] "Request Body" body=""
	I1217 20:26:15.740334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:15.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.240695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:16.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:16.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:16.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:16.740834  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:17.240356  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.240434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:17.333063  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:17.388410  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.391967  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.391995  414292 retry.go:31] will retry after 13.113728844s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.447345  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:17.505124  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:17.505163  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.505182  414292 retry.go:31] will retry after 11.452403849s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:17.740560  414292 type.go:168] "Request Body" body=""
	I1217 20:26:17.740629  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:17.740885  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.240553  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.240633  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:18.740512  414292 type.go:168] "Request Body" body=""
	I1217 20:26:18.740589  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:18.740904  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:18.740955  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:19.240885  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.240962  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.241213  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:19.741015  414292 type.go:168] "Request Body" body=""
	I1217 20:26:19.741087  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:19.741404  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.240182  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.240627  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:20.740108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:20.740183  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:20.740453  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:21.240216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.240698  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:21.240754  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:21.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:26:21.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:21.740628  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.240933  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.241257  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:22.741110  414292 type.go:168] "Request Body" body=""
	I1217 20:26:22.741224  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:22.741585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.240288  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.240369  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:23.741096  414292 type.go:168] "Request Body" body=""
	I1217 20:26:23.741162  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:23.741447  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:23.741492  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:24.240108  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.240184  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.240503  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:24.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:24.740312  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:24.740605  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.240472  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:25.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:25.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:25.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:26.240350  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.240423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:26.240856  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:26.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:26:26.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:26.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.240571  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:27.740306  414292 type.go:168] "Request Body" body=""
	I1217 20:26:27.740387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:27.740718  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.240518  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.240860  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:28.240905  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:28.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:26:28.740776  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:28.741110  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:28.958534  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:29.018842  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:29.024509  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.024543  414292 retry.go:31] will retry after 28.006345092s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:29.241080  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.241493  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:29.740997  414292 type.go:168] "Request Body" body=""
	I1217 20:26:29.741065  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:29.741356  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:30.241045  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.241435  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:30.241493  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:30.505938  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:26:30.574101  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:30.574147  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.574166  414292 retry.go:31] will retry after 31.982210322s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:30.740490  414292 type.go:168] "Request Body" body=""
	I1217 20:26:30.740579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:30.740933  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:31.240692  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.240768  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.248432  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=7
	I1217 20:26:31.740179  414292 type.go:168] "Request Body" body=""
	I1217 20:26:31.740287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:31.740647  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:32.740084  414292 type.go:168] "Request Body" body=""
	I1217 20:26:32.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:32.740461  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:32.740520  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:33.240200  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.240293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.240625  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:33.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:26:33.740319  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:33.740635  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.240634  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.240711  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.241019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:26:34.740788  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:34.741122  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:34.741178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:35.240958  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.241039  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.241368  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:35.740050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:35.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:35.740407  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.240271  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.240623  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:36.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:26:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:36.740609  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:37.240114  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.240213  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.240481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:37.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:37.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:26:37.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:37.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.240448  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.240537  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.240850  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:38.740388  414292 type.go:168] "Request Body" body=""
	I1217 20:26:38.740461  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:38.740786  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:39.240612  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.240699  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:39.241129  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:39.740905  414292 type.go:168] "Request Body" body=""
	I1217 20:26:39.740985  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:39.741321  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.240050  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.240123  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.240466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:40.740172  414292 type.go:168] "Request Body" body=""
	I1217 20:26:40.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:40.740632  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.240336  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.240409  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:41.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:41.740423  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:41.740730  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:41.740787  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:42.240202  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:42.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:26:42.740284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:42.740581  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.240458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:43.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:43.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:43.740597  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:44.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:44.240672  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:44.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:26:44.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:44.740701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.240268  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.240370  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.240810  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:45.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:26:45.740481  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:45.740887  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:46.240669  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.240750  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:46.241046  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:46.740832  414292 type.go:168] "Request Body" body=""
	I1217 20:26:46.740907  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:46.741230  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.241112  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.241195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.241535  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:47.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:47.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:47.740564  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.240565  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.240997  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:48.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:26:48.740893  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:48.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:48.741305  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:49.241092  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.241159  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:49.740094  414292 type.go:168] "Request Body" body=""
	I1217 20:26:49.740170  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:49.740483  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.240219  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.240334  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.240696  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:50.740130  414292 type.go:168] "Request Body" body=""
	I1217 20:26:50.740210  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:50.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:51.240180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.240279  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.240607  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:51.240658  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:51.740209  414292 type.go:168] "Request Body" body=""
	I1217 20:26:51.740323  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:51.740662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.240355  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.240693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:52.740381  414292 type.go:168] "Request Body" body=""
	I1217 20:26:52.740464  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:52.740824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:53.240545  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.240622  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.240967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:53.241022  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:53.740774  414292 type.go:168] "Request Body" body=""
	I1217 20:26:53.740855  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:53.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.240965  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.241396  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:54.740180  414292 type.go:168] "Request Body" body=""
	I1217 20:26:54.740269  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:54.740570  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.240136  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.240208  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.240531  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:55.740216  414292 type.go:168] "Request Body" body=""
	I1217 20:26:55.740305  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:55.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:55.740689  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:56.240227  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.240326  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:56.740128  414292 type.go:168] "Request Body" body=""
	I1217 20:26:56.740207  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:56.740534  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.031083  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:26:57.091368  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:26:57.091412  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.091434  414292 retry.go:31] will retry after 46.71155063s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:26:57.240719  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.240799  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.241113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:57.740782  414292 type.go:168] "Request Body" body=""
	I1217 20:26:57.740862  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:57.741143  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:57.741191  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:26:58.240610  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.240925  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:58.740701  414292 type.go:168] "Request Body" body=""
	I1217 20:26:58.740774  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:58.741126  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.241090  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.241163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.241466  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:26:59.740862  414292 type.go:168] "Request Body" body=""
	I1217 20:26:59.740930  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:26:59.741174  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:26:59.741215  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:00.241177  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.241266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.241643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:00.740463  414292 type.go:168] "Request Body" body=""
	I1217 20:27:00.740543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:00.740888  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.240688  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.240764  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.241063  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:01.740881  414292 type.go:168] "Request Body" body=""
	I1217 20:27:01.740989  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:01.741337  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:01.741388  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:02.240100  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.240176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.240556  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:02.557038  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:02.616976  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:02.620493  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.620531  414292 retry.go:31] will retry after 42.622456402s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1217 20:27:02.740802  414292 type.go:168] "Request Body" body=""
	I1217 20:27:02.740875  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:02.741140  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.240977  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.241074  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.241392  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:03.740139  414292 type.go:168] "Request Body" body=""
	I1217 20:27:03.740236  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:03.740586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:04.240156  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.240238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.240536  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:04.240579  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:04.740272  414292 type.go:168] "Request Body" body=""
	I1217 20:27:04.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:04.740738  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.240287  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.240617  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:05.740277  414292 type.go:168] "Request Body" body=""
	I1217 20:27:05.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:05.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:06.240183  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.240615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:06.240675  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:06.740224  414292 type.go:168] "Request Body" body=""
	I1217 20:27:06.740355  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:06.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.240397  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.240474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.240744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:07.740218  414292 type.go:168] "Request Body" body=""
	I1217 20:27:07.740311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:07.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:08.240435  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.240511  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.240866  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:08.240920  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:08.740675  414292 type.go:168] "Request Body" body=""
	I1217 20:27:08.740748  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:08.741014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.241042  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.241128  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.241481  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:09.740097  414292 type.go:168] "Request Body" body=""
	I1217 20:27:09.740192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:09.740525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.240559  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:10.740356  414292 type.go:168] "Request Body" body=""
	I1217 20:27:10.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:10.740782  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:10.740855  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:11.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.240674  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:11.740833  414292 type.go:168] "Request Body" body=""
	I1217 20:27:11.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:11.741195  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.241068  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.241147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.241476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:12.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:27:12.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:12.740663  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:13.240124  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.240197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.240538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:13.240601  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:13.740193  414292 type.go:168] "Request Body" body=""
	I1217 20:27:13.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:13.740596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.240723  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.240797  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.241150  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:14.740979  414292 type.go:168] "Request Body" body=""
	I1217 20:27:14.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:14.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.240146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.240479  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:27:15.740243  414292 type.go:168] "Request Body" body=""
	I1217 20:27:15.740346  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:15.740681  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:27:15.740748  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:16.240353  414292 type.go:168] "Request Body" body=""
	I1217 20:27:16.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:16.240686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... the same GET/empty-response pair repeated every ~500ms through 20:27:43.74; node_ready.go:55 logged the "connection refused" retry warning again at 20:27:18, 20:27:20, 20:27:22, 20:27:25, 20:27:27, 20:27:29, 20:27:31, 20:27:34, 20:27:36, 20:27:38, 20:27:41 and 20:27:43 ...]
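The loop above is client-go's round tripper retrying a single GET on the node object every ~500ms; "connection refused" means nothing is accepting TCP connections on 192.168.49.2:8441 yet, so each attempt fails before any HTTP exchange happens. A minimal sketch of the same probe, assuming curl is available on the host and skipping certificate verification (minikube itself authenticates with client certificates, not -k):

	# probe the apiserver every 500ms until it accepts connections
	until curl -ks --max-time 2 \
	    "https://192.168.49.2:8441/api/v1/nodes/functional-682596" >/dev/null; do
	  echo "will retry: connection refused"
	  sleep 0.5
	done
	echo "apiserver answered"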
	I1217 20:27:43.803299  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1217 20:27:43.859759  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863204  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:43.863297  414292 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
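The validation failure is secondary: kubectl apply first downloads the OpenAPI schema from the server to validate the manifest, and that download dies on the same refused connection. The error text suggests --validate=false, but that only skips the schema download; the apply request itself still needs a reachable apiserver, so in the state logged here it would fail the same way. For reference, the re-run the error message proposes (binary and paths copied from the log):

	sudo KUBECONFIG=/var/lib/minikube/kubeconfig \
	  /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force --validate=false \
	  -f /etc/kubernetes/addons/storageclass.yaml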
	[... two more refused GETs at 20:27:44.24 and 20:27:44.74, both with empty responses ...]
	I1217 20:27:45.240574  414292 type.go:168] "Request Body" body=""
	I1217 20:27:45.240742  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:27:45.242019  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	W1217 20:27:45.242186  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:27:45.244153  414292 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 20:27:45.319007  414292 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319122  414292 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1217 20:27:45.319226  414292 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1217 20:27:45.322350  414292 out.go:179] * Enabled addons: 
	I1217 20:27:45.325857  414292 addons.go:530] duration metric: took 1m57.123269017s for enable addons: enabled=[]
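Note that the enable step finishes with an empty addon list ("enabled=[]") even though both applies failed: the failures are surfaced as warnings and the start sequence moves on. Before re-enabling addons it is worth confirming the apiserver is actually serving; a minimal check, reusing the kubeconfig and binary paths from the log (/readyz is the standard apiserver readiness endpoint):

	# should print "ok" once kube-apiserver is serving on 8441;
	# in the state logged above it fails with "connection refused"
	sudo KUBECONFIG=/var/lib/minikube/kubeconfig \
	  /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl get --raw='/readyz'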
	[... five more refused GETs between 20:27:45.74 and 20:27:47.74, each with an empty response ...]
	W1217 20:27:47.746914  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	[... polling continued every ~500ms through 20:28:14.74 with the same refused GET and empty response; node_ready.go:55 repeated the retry warning at 20:27:50, 20:27:52, 20:27:54, 20:27:57, 20:27:59, 20:28:01, 20:28:04, 20:28:06, 20:28:08, 20:28:11 and 20:28:13 ...]
	I1217 20:28:15.240203  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.240304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.240633  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:15.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:28:15.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:15.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:15.740840  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:16.240359  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.240436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.240823  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:16.740505  414292 type.go:168] "Request Body" body=""
	I1217 20:28:16.740583  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:16.740911  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.240693  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.240772  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.241095  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:17.740839  414292 type.go:168] "Request Body" body=""
	I1217 20:28:17.740925  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:17.741192  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:17.741241  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:18.241106  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.241186  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.241520  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:18.740223  414292 type.go:168] "Request Body" body=""
	I1217 20:28:18.740344  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:18.740693  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.240581  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.240841  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:19.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:28:19.740291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:19.740610  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:20.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.240438  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.240765  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:20.240823  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:20.740348  414292 type.go:168] "Request Body" body=""
	I1217 20:28:20.740426  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:20.740691  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.240298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.240640  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:21.740375  414292 type.go:168] "Request Body" body=""
	I1217 20:28:21.740465  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:21.740819  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.240416  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.240484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.240741  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:22.740166  414292 type.go:168] "Request Body" body=""
	I1217 20:28:22.740262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:22.740592  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:22.740654  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:23.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:23.740434  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:23.740759  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.240649  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.240730  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.241071  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:24.740869  414292 type.go:168] "Request Body" body=""
	I1217 20:28:24.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:24.741289  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:24.741351  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:25.241053  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.241120  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:25.740052  414292 type.go:168] "Request Body" body=""
	I1217 20:28:25.740126  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:25.740478  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.240294  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.240602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:26.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:26.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:26.740507  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:27.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.240284  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.240622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:27.240679  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:27.740368  414292 type.go:168] "Request Body" body=""
	I1217 20:28:27.740447  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:27.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.240689  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.240769  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:28.740814  414292 type.go:168] "Request Body" body=""
	I1217 20:28:28.740891  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:28.741191  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:29.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.241045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.241401  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:29.241453  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:29.740076  414292 type.go:168] "Request Body" body=""
	I1217 20:28:29.740154  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:29.740474  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.240176  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.240673  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:30.740407  414292 type.go:168] "Request Body" body=""
	I1217 20:28:30.740484  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:30.740834  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.240421  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.240680  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:31.740183  414292 type.go:168] "Request Body" body=""
	I1217 20:28:31.740278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:31.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:31.740674  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:32.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.240620  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:32.740110  414292 type.go:168] "Request Body" body=""
	I1217 20:28:32.740177  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:32.740450  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.240131  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.240205  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.240522  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:33.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:28:33.740169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:33.740516  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:34.240083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.240169  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.240471  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:34.240522  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:34.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:34.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:34.740604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.240295  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:35.740083  414292 type.go:168] "Request Body" body=""
	I1217 20:28:35.740163  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:35.740501  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:36.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.240220  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:36.240620  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:36.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:28:36.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:36.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.240126  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.240510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:37.740120  414292 type.go:168] "Request Body" body=""
	I1217 20:28:37.740203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:37.740533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:38.240393  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.240473  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.240804  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:38.240859  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
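Note that while a request/response pair is logged for every 500ms attempt, the node_ready.go warning only surfaces roughly every 2–2.5 seconds, about one attempt in five. A hypothetical sketch of that kind of time-based warning throttling (nothing below is taken from minikube's source):

```go
package main

import (
	"fmt"
	"time"
)

// Hypothetical helper: emit at most one warning per two-second window,
// so a tight 500ms retry loop does not flood the log.
var lastWarn time.Time

func warnThrottled(format string, args ...any) {
	if time.Since(lastWarn) < 2*time.Second {
		return
	}
	lastWarn = time.Now()
	fmt.Printf("W "+format+"\n", args...)
}

func main() {
	for i := 0; i < 10; i++ {
		warnThrottled("error getting node %q (will retry)", "functional-682596")
		time.Sleep(500 * time.Millisecond)
	}
}
```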
	I1217 20:28:38.740418  414292 type.go:168] "Request Body" body=""
	I1217 20:28:38.740493  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:38.740792  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.240679  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.240778  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.241109  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:39.740934  414292 type.go:168] "Request Body" body=""
	I1217 20:28:39.741010  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:39.741373  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.240097  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.240172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.240452  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:40.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:28:40.740221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:40.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:40.740634  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:41.240170  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.240273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:41.740062  414292 type.go:168] "Request Body" body=""
	I1217 20:28:41.740137  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:41.740430  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.240290  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.240668  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:42.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:28:42.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:42.740779  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:42.740831  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:43.240367  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.240476  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:43.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:28:43.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:43.740560  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.240459  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.240534  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.240870  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:44.740354  414292 type.go:168] "Request Body" body=""
	I1217 20:28:44.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:44.740695  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:45.240439  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.240568  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.241041  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:45.241102  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:45.740871  414292 type.go:168] "Request Body" body=""
	I1217 20:28:45.740956  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:45.741304  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.241069  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.241138  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.241455  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:46.740186  414292 type.go:168] "Request Body" body=""
	I1217 20:28:46.740277  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.240184  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:47.740189  414292 type.go:168] "Request Body" body=""
	I1217 20:28:47.740282  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:47.740601  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:47.740657  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:48.240600  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.240678  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.241014  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:48.740811  414292 type.go:168] "Request Body" body=""
	I1217 20:28:48.740888  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:48.741185  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.240973  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.241327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:49.741121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:49.741215  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:49.741596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:49.741649  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:50.240162  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:50.740121  414292 type.go:168] "Request Body" body=""
	I1217 20:28:50.740197  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:50.740508  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.240167  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.240239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.240596  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:51.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:28:51.740310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:51.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:52.240373  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.240709  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:52.240761  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
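Each attempt still produces a paired "Request"/"Response" log line even though the dial fails; with no HTTP response to report, the response line carries status="" headers="" milliseconds=0. A sketch of a logging round tripper with that shape, assuming nothing about client-go's actual round_trippers implementation (the wrapper type below is hypothetical):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// loggingRT wraps another RoundTripper and logs a "Request"/"Response"
// pair per call, in the spirit of the round_trippers.go lines above.
type loggingRT struct{ next http.RoundTripper }

func (l loggingRT) RoundTrip(req *http.Request) (*http.Response, error) {
	fmt.Printf("I \"Request\" verb=%q url=%q\n", req.Method, req.URL.String())
	start := time.Now()
	resp, err := l.next.RoundTrip(req)
	// On a transport error (e.g. connection refused) resp is nil, so
	// the logged status stays empty and the latency rounds to 0ms.
	status := ""
	if resp != nil {
		status = resp.Status
	}
	fmt.Printf("I \"Response\" status=%q milliseconds=%d\n", status, time.Since(start).Milliseconds())
	return resp, err
}

func main() {
	client := &http.Client{Transport: loggingRT{next: http.DefaultTransport}}
	if _, err := client.Get("https://192.168.49.2:8441/api/v1/nodes/functional-682596"); err != nil {
		fmt.Println("W will retry:", err)
	}
}
```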
	I1217 20:28:52.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:52.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:52.740584  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:53.740309  414292 type.go:168] "Request Body" body=""
	I1217 20:28:53.740380  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:53.740678  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:54.240655  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.241066  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:54.241123  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:54.740804  414292 type.go:168] "Request Body" body=""
	I1217 20:28:54.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:54.741266  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.241066  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.241142  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.241410  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:55.740129  414292 type.go:168] "Request Body" body=""
	I1217 20:28:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:55.740615  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.240178  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.240621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:56.740206  414292 type.go:168] "Request Body" body=""
	I1217 20:28:56.740289  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:56.740613  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:56.740664  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:57.240172  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.240598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:57.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:28:57.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:57.740644  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.240496  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.240566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.240825  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:58.740558  414292 type.go:168] "Request Body" body=""
	I1217 20:28:58.740638  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:58.740975  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:28:58.741026  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:28:59.240760  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.240835  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:28:59.740992  414292 type.go:168] "Request Body" body=""
	I1217 20:28:59.741067  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:28:59.741325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.241179  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.241274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.241594  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:00.740545  414292 type.go:168] "Request Body" body=""
	I1217 20:29:00.740619  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:00.740922  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:01.240654  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.240729  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.241023  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:01.241072  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:01.740787  414292 type.go:168] "Request Body" body=""
	I1217 20:29:01.740865  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:01.741183  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.240971  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.241051  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.241393  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:02.740980  414292 type.go:168] "Request Body" body=""
	I1217 20:29:02.741059  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:02.741391  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.240071  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.240147  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.240491  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:03.740095  414292 type.go:168] "Request Body" body=""
	I1217 20:29:03.740172  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:03.740538  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:03.740593  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:04.240122  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.240206  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.240574  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:04.740282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:04.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:04.740712  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.240264  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.240599  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:05.740352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:05.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:05.740697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:05.740738  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:06.240370  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.240445  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.240788  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:06.741116  414292 type.go:168] "Request Body" body=""
	I1217 20:29:06.741191  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:06.741539  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.240080  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.240155  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.240463  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:07.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:07.740303  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:07.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:08.240575  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.240648  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.241002  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:08.241060  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:08.740757  414292 type.go:168] "Request Body" body=""
	I1217 20:29:08.740826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:08.741089  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.241059  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.241165  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.241510  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:09.740220  414292 type.go:168] "Request Body" body=""
	I1217 20:29:09.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:09.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.240344  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.240444  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.240756  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:10.740205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:10.740293  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:10.740629  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:10.740685  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:11.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.240454  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.240798  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:11.740377  414292 type.go:168] "Request Body" body=""
	I1217 20:29:11.740475  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:11.740809  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.240548  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.240621  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.240958  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:12.740753  414292 type.go:168] "Request Body" body=""
	I1217 20:29:12.740831  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:12.741131  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:12.741180  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:13.240743  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.240816  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.241082  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:13.740877  414292 type.go:168] "Request Body" body=""
	I1217 20:29:13.740957  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:13.741251  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.241067  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.241141  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.241456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:14.740157  414292 type.go:168] "Request Body" body=""
	I1217 20:29:14.740261  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:14.740657  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:15.240363  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.240796  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:15.240849  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:15.740520  414292 type.go:168] "Request Body" body=""
	I1217 20:29:15.740600  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:15.740926  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.240670  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.240741  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.241005  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:16.740812  414292 type.go:168] "Request Body" body=""
	I1217 20:29:16.740895  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:16.741250  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:17.241030  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.241104  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.241485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:17.241544  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:17.741007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:17.741075  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:17.741330  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.240431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.240777  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:18.740148  414292 type.go:168] "Request Body" body=""
	I1217 20:29:18.740233  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:18.740593  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.240118  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.240527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:19.740167  414292 type.go:168] "Request Body" body=""
	I1217 20:29:19.740241  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:19.740598  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:19.740652  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:20.240357  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.240764  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:20.740360  414292 type.go:168] "Request Body" body=""
	I1217 20:29:20.740433  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:20.740702  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.240449  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.240532  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.240864  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:21.740699  414292 type.go:168] "Request Body" body=""
	I1217 20:29:21.740783  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:21.741119  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:21.741177  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:22.240878  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.240947  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.241200  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:22.740985  414292 type.go:168] "Request Body" body=""
	I1217 20:29:22.741058  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:22.741409  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.240195  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.240546  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:23.740154  414292 type.go:168] "Request Body" body=""
	I1217 20:29:23.740239  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:23.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:24.240534  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.240612  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.240947  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:24.241000  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:24.740777  414292 type.go:168] "Request Body" body=""
	I1217 20:29:24.740857  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:24.741204  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.240998  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.241333  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:25.741123  414292 type.go:168] "Request Body" body=""
	I1217 20:29:25.741202  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:25.741530  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.240299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.240642  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:26.740361  414292 type.go:168] "Request Body" body=""
	I1217 20:29:26.740432  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:26.740744  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:26.740792  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.240243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.240585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:27.740308  414292 type.go:168] "Request Body" body=""
	I1217 20:29:27.740392  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:27.740767  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.240517  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.240588  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.240857  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:28.740679  414292 type.go:168] "Request Body" body=""
	I1217 20:29:28.740752  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:28.741120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:28.741173  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:29.240914  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.240995  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.241335  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:29.740969  414292 type.go:168] "Request Body" body=""
	I1217 20:29:29.741045  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:29.741327  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.240069  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.240150  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.240512  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:30.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:30.740296  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:30.740646  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:31.240121  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.240521  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:31.240571  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:31.740162  414292 type.go:168] "Request Body" body=""
	I1217 20:29:31.740238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:31.740576  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.240165  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.240262  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.240572  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:32.740149  414292 type.go:168] "Request Body" body=""
	I1217 20:29:32.740217  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:32.740555  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:33.240266  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.240342  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.240665  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:33.240725  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:33.740182  414292 type.go:168] "Request Body" body=""
	I1217 20:29:33.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:33.740619  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.240111  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.240178  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.240456  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:34.740169  414292 type.go:168] "Request Body" body=""
	I1217 20:29:34.740275  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:34.740676  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.240238  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.240333  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.240684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:35.740366  414292 type.go:168] "Request Body" body=""
	I1217 20:29:35.740431  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:35.740683  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:35.740724  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:36.240205  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.240297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.240641  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:36.740372  414292 type.go:168] "Request Body" body=""
	I1217 20:29:36.740448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:36.740761  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.240379  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.240448  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:37.740215  414292 type.go:168] "Request Body" body=""
	I1217 20:29:37.740301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:37.740614  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:38.240590  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.240670  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.241007  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:38.241051  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:38.740806  414292 type.go:168] "Request Body" body=""
	I1217 20:29:38.740880  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:38.741145  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.240077  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.240158  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.240533  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:39.740138  414292 type.go:168] "Request Body" body=""
	I1217 20:29:39.740216  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:39.740575  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.240271  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.240352  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.240630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:40.740330  414292 type.go:168] "Request Body" body=""
	I1217 20:29:40.740414  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:40.740751  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:40.740807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:41.240185  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.240278  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.240603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:41.740073  414292 type.go:168] "Request Body" body=""
	I1217 20:29:41.740152  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:41.740438  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.240310  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.240671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:42.740385  414292 type.go:168] "Request Body" body=""
	I1217 20:29:42.740463  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:42.740790  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:42.740846  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:43.240351  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.240416  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.240662  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:43.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:43.740242  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:43.740569  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.240555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.241028  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:44.740766  414292 type.go:168] "Request Body" body=""
	I1217 20:29:44.740836  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:44.741107  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:44.741148  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:45.241078  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.241164  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.241604  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:45.740201  414292 type.go:168] "Request Body" body=""
	I1217 20:29:45.740298  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:45.740651  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.240352  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.240425  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:46.740190  414292 type.go:168] "Request Body" body=""
	I1217 20:29:46.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:46.740656  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:47.240376  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.240462  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.240824  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:47.240891  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:47.740486  414292 type.go:168] "Request Body" body=""
	I1217 20:29:47.740555  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:47.740822  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.240800  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.240885  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.241255  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:48.741124  414292 type.go:168] "Request Body" body=""
	I1217 20:29:48.741203  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:48.741664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.240445  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.240522  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.240794  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:49.743007  414292 type.go:168] "Request Body" body=""
	I1217 20:29:49.743084  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:49.743421  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:49.743497  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:50.241133  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.241229  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.241648  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:50.740153  414292 type.go:168] "Request Body" body=""
	I1217 20:29:50.740228  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:50.740573  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.240197  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.240301  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.240639  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:51.740398  414292 type.go:168] "Request Body" body=""
	I1217 20:29:51.740482  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:51.740812  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:52.240354  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.240429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.240749  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:52.240807  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:52.740175  414292 type.go:168] "Request Body" body=""
	I1217 20:29:52.740270  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:52.740621  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.240207  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.240318  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.240664  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:53.740074  414292 type.go:168] "Request Body" body=""
	I1217 20:29:53.740144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:53.740420  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.240102  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.240179  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.240492  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:54.740171  414292 type.go:168] "Request Body" body=""
	I1217 20:29:54.740266  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:54.740580  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:54.740639  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:55.240282  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.240356  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.240697  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:55.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:29:55.740281  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:55.740624  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.240062  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.240140  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.240485  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:56.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:29:56.740243  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:56.740524  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:57.240191  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.240286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.240658  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:57.240715  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:57.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:29:57.740339  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:57.740672  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.240460  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.240543  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.240844  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:58.740210  414292 type.go:168] "Request Body" body=""
	I1217 20:29:58.740304  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:58.740643  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:29:59.240481  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.240553  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.240919  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:29:59.240975  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:29:59.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:29:59.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:29:59.740722  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... the identical GET https://192.168.49.2:8441/api/v1/nodes/functional-682596 request/response cycle repeats every ~500ms from 20:30:00.240 through 20:31:01.241, each attempt returning no response (status="" milliseconds=0); node_ready.go:55 re-logs the same "connection refused (will retry)" warning after every fourth or fifth poll (20:30:01, 20:30:03, 20:30:06, ..., 20:30:59) ...]
	W1217 20:31:01.241213  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:01.740960  414292 type.go:168] "Request Body" body=""
	I1217 20:31:01.741043  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:01.741371  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.240103  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.240187  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:02.740227  414292 type.go:168] "Request Body" body=""
	I1217 20:31:02.740315  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:02.740637  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.240193  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.240291  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.240590  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:03.740279  414292 type.go:168] "Request Body" body=""
	I1217 20:31:03.740349  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:03.740684  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:03.740743  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:04.240506  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.240829  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:04.740200  414292 type.go:168] "Request Body" body=""
	I1217 20:31:04.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:04.740630  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.240188  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.240285  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.240600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:05.740119  414292 type.go:168] "Request Body" body=""
	I1217 20:31:05.740198  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:05.740527  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:06.240220  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.240652  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:06.240704  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:06.740395  414292 type.go:168] "Request Body" body=""
	I1217 20:31:06.740474  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:06.740826  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.240362  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.240437  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.240699  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:07.740382  414292 type.go:168] "Request Body" body=""
	I1217 20:31:07.740456  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:07.740780  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:08.240694  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.241125  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:08.241178  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:08.740933  414292 type.go:168] "Request Body" body=""
	I1217 20:31:08.741009  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:08.741272  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.240107  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.240192  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.240509  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:09.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:09.740317  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:09.740653  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.240147  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.240221  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.240567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:10.740283  414292 type.go:168] "Request Body" body=""
	I1217 20:31:10.740362  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:10.740720  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:10.740781  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:11.240308  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.240387  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.240742  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:11.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:11.740427  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:11.740686  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.240168  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.240582  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:12.740305  414292 type.go:168] "Request Body" body=""
	I1217 20:31:12.740382  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:12.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:13.240360  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.240435  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.240753  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:13.240804  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:13.740494  414292 type.go:168] "Request Body" body=""
	I1217 20:31:13.740566  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:13.740865  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.240695  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.240775  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.241120  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:14.740883  414292 type.go:168] "Request Body" body=""
	I1217 20:31:14.740949  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:14.741207  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:15.241001  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.241086  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.241424  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:15.241480  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:15.740181  414292 type.go:168] "Request Body" body=""
	I1217 20:31:15.740286  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:15.740606  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.240120  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.240190  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.240504  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:16.740165  414292 type.go:168] "Request Body" body=""
	I1217 20:31:16.740263  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:16.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.240189  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.240280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.240612  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:17.740176  414292 type.go:168] "Request Body" body=""
	I1217 20:31:17.740245  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:17.740595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:17.740646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:18.240578  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.240656  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.241010  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:18.740197  414292 type.go:168] "Request Body" body=""
	I1217 20:31:18.740299  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:18.740622  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.240589  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.240665  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.240973  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:19.740527  414292 type.go:168] "Request Body" body=""
	I1217 20:31:19.740602  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:19.740938  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:19.741004  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:20.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.240826  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.241171  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:20.740847  414292 type.go:168] "Request Body" body=""
	I1217 20:31:20.740927  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:20.741205  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.240999  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.241077  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.241433  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:21.741076  414292 type.go:168] "Request Body" body=""
	I1217 20:31:21.741157  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:21.741476  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:21.741535  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:22.240148  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.240225  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.240595  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:22.740191  414292 type.go:168] "Request Body" body=""
	I1217 20:31:22.740288  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:22.740671  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.240361  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.240439  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.240766  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:23.740367  414292 type.go:168] "Request Body" body=""
	I1217 20:31:23.740436  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:23.740727  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:24.240746  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.240834  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.241219  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:24.241275  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:24.740840  414292 type.go:168] "Request Body" body=""
	I1217 20:31:24.740915  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:24.741224  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.240994  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.241066  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.241325  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:25.741146  414292 type.go:168] "Request Body" body=""
	I1217 20:31:25.741238  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:25.741600  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.240173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.240274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.240566  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:26.740222  414292 type.go:168] "Request Body" body=""
	I1217 20:31:26.740302  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:26.740563  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:26.740604  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:27.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:27.740174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:27.740267  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:27.740588  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.240508  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.240579  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.240847  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:28.740580  414292 type.go:168] "Request Body" body=""
	I1217 20:31:28.740654  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:28.740974  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:28.741030  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:29.240927  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.241003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.241345  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:29.740932  414292 type.go:168] "Request Body" body=""
	I1217 20:31:29.741003  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:29.741297  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.240066  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.240144  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.240477  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:30.740196  414292 type.go:168] "Request Body" body=""
	I1217 20:31:30.740297  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:30.740655  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:31.240152  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.240227  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.240525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:31.240572  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:31.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:31.740274  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:31.740631  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.240364  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.240441  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.240793  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:32.740353  414292 type.go:168] "Request Body" body=""
	I1217 20:31:32.740429  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:32.740739  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:33.240174  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.240265  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.240586  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:33.240635  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:33.740239  414292 type.go:168] "Request Body" body=""
	I1217 20:31:33.740336  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:33.740654  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.240597  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.240677  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.240945  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:34.740715  414292 type.go:168] "Request Body" body=""
	I1217 20:31:34.740794  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:34.741113  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:35.240931  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.241005  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.241378  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:35.241431  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:35.740086  414292 type.go:168] "Request Body" body=""
	I1217 20:31:35.740156  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:35.740458  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.240166  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.240268  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:36.740185  414292 type.go:168] "Request Body" body=""
	I1217 20:31:36.740283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:36.740585  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.240233  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.240320  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.240565  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:37.740177  414292 type.go:168] "Request Body" body=""
	I1217 20:31:37.740273  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:37.740567  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:37.740616  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:38.240625  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.240697  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.241070  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:38.740867  414292 type.go:168] "Request Body" body=""
	I1217 20:31:38.740936  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:38.741194  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.240063  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.240204  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.240542  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:39.740275  414292 type.go:168] "Request Body" body=""
	I1217 20:31:39.740351  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:39.740669  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:39.740728  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:40.240374  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.240446  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.240701  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:40.740211  414292 type.go:168] "Request Body" body=""
	I1217 20:31:40.740308  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:40.740679  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.240409  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.240499  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.240858  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:41.740355  414292 type.go:168] "Request Body" body=""
	I1217 20:31:41.740455  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:41.740717  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:41.740768  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:42.240201  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.240311  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.240703  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:42.740571  414292 type.go:168] "Request Body" body=""
	I1217 20:31:42.740645  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:42.740967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.240727  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.240796  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.241050  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:43.740827  414292 type.go:168] "Request Body" body=""
	I1217 20:31:43.740901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:43.741236  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:43.741293  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:44.241091  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.241176  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.241525  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:44.740194  414292 type.go:168] "Request Body" body=""
	I1217 20:31:44.740280  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:44.745967  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1217 20:31:45.240798  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.240901  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.241310  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:45.741143  414292 type.go:168] "Request Body" body=""
	I1217 20:31:45.741226  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:45.741583  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:45.741646  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:46.241073  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.241146  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.241399  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:46.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:46.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:46.740602  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.240190  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.240283  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.240589  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:47.740187  414292 type.go:168] "Request Body" body=""
	I1217 20:31:47.740307  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:47.740649  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:48.240470  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.240554  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.241013  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1217 20:31:48.241064  414292 node_ready.go:55] error getting node "functional-682596" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-682596": dial tcp 192.168.49.2:8441: connect: connection refused
	I1217 20:31:48.740173  414292 type.go:168] "Request Body" body=""
	I1217 20:31:48.740272  414292 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-682596" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1217 20:31:48.740603  414292 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1217 20:31:49.240608  414292 type.go:168] "Request Body" body=""
	I1217 20:31:49.240675  414292 node_ready.go:38] duration metric: took 6m0.000721639s for node "functional-682596" to be "Ready" ...
	I1217 20:31:49.243794  414292 out.go:203] 
	W1217 20:31:49.246551  414292 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1217 20:31:49.246575  414292 out.go:285] * 
	W1217 20:31:49.249079  414292 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:31:49.251429  414292 out.go:203] 
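
The trace above is the standard poll-until-deadline pattern: the same GET against /api/v1/nodes/functional-682596 is reissued every ~500ms until the 6m0s wait expires, and the context error then surfaces as the GUEST_START failure. Below is a minimal, self-contained Go sketch of that pattern. It is an illustration only, not minikube's actual node_ready.go; the checkReady stub, the function names, and the shortened 2s deadline are assumptions made for the example.

// poll_ready_sketch.go — hypothetical sketch of the retry loop visible in
// the log above: poll a node's Ready condition on a fixed interval until a
// context deadline expires.
package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// checkReady stands in for the GET /api/v1/nodes/<name> call; here it always
// fails, mimicking the "connection refused" responses in the log.
func checkReady(ctx context.Context, node string) error {
	return fmt.Errorf("Get %q: dial tcp 192.168.49.2:8441: connect: connection refused", node)
}

// waitNodeReady retries checkReady every interval until ctx expires,
// returning the context error on deadline, like the 6m0s timeout above.
func waitNodeReady(ctx context.Context, node string, interval time.Duration) error {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		if err := checkReady(ctx, node); err == nil {
			return nil
		} else {
			fmt.Printf("W error getting node %q condition \"Ready\" status (will retry): %v\n", node, err)
		}
		select {
		case <-ctx.Done():
			return ctx.Err() // surfaces as "context deadline exceeded"
		case <-ticker.C:
		}
	}
}

func main() {
	// 2s here; the real run waited 6m0s before giving up.
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()
	if err := waitNodeReady(ctx, "functional-682596", 500*time.Millisecond); err != nil {
		if errors.Is(err, context.DeadlineExceeded) {
			fmt.Println("X Exiting: waiting for node to be ready: context deadline exceeded")
		}
	}
}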
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:31:56 functional-682596 containerd[5330]: time="2025-12-17T20:31:56.842985708Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.883089157Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.885250624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.893417768Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:57 functional-682596 containerd[5330]: time="2025-12-17T20:31:57.894067556Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.853428363Z" level=info msg="No images store for sha256:f8359c2c10bc3fa09ea92f06d2cc7d3c863814f8c0b38cad60a5f93eb6785f57"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.855759350Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-682596\""
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.862693152Z" level=info msg="ImageCreate event name:\"sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:58 functional-682596 containerd[5330]: time="2025-12-17T20:31:58.863308362Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.639741021Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.642083864Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.644000209Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 17 20:31:59 functional-682596 containerd[5330]: time="2025-12-17T20:31:59.656111207Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.814254431Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.816646826Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.819891340Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.826461085Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.992051432Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 17 20:32:00 functional-682596 containerd[5330]: time="2025-12-17T20:32:00.994318919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.001983551Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.002925501Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.129645302Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.132017331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.142971508Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:32:01 functional-682596 containerd[5330]: time="2025-12-17T20:32:01.143462870Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:32:05.280988    9418 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:05.281625    9418 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:05.283184    9418 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:05.283525    9418 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:32:05.285059    9418 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:32:05 up  3:14,  0 user,  load average: 0.50, 0.36, 0.78
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 17 20:32:02 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:02 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:02 functional-682596 kubelet[9242]: E1217 20:32:02.815164    9242 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:02 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:03 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Dec 17 20:32:03 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:03 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:03 functional-682596 kubelet[9291]: E1217 20:32:03.588467    9291 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:03 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:03 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:04 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 829.
	Dec 17 20:32:04 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:04 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:04 functional-682596 kubelet[9319]: E1217 20:32:04.297962    9319 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:04 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:04 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:32:04 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 830.
	Dec 17 20:32:04 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:04 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:32:05 functional-682596 kubelet[9355]: E1217 20:32:05.053943    9355 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:32:05 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:32:05 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (343.177016ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.41s)
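
Note on the crash loop above: kubelet v1.35.0-rc.1 exits at startup because this host runs cgroup v1, which the release rejects by default; the validation error names the escape hatch, the KubeletConfiguration option FailCgroupV1. A minimal sketch of that change, assuming /var/lib/kubelet/config.yaml (the path the kubeadm output later in this report writes) is the active config and does not already set the field:

	# Run inside the node, e.g. via `minikube ssh -p functional-682596`.
	# `failCgroupV1` is the serialized form of the FailCgroupV1 option
	# cited in the kubelet error; appending only works if the key is not
	# already present at the top level of the file.
	echo 'failCgroupV1: false' | sudo tee -a /var/lib/kubelet/config.yaml
	sudo systemctl restart kubelet

kubeadm's SystemVerification preflight check must also be skipped for the same reason; the kubeadm init invocation quoted below already includes SystemVerification in --ignore-preflight-errors.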

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (736.12s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1217 20:34:49.015212  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:36:28.508751  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:37:51.572409  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:39:49.015275  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:41:28.507570  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m14.085235523s)

-- stdout --
	* [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	* Pulling base image v0.0.48-1765661130-22141 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision
	
	

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
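
The suggestion above is minikube's generic advice for kubelet startup failures; on this host the kubeadm warnings point at the cgroup v1 validation rather than the cgroup driver, so the flag may not help. A quick check of the host's cgroup version, followed by the suggested command verbatim (both a sketch, not part of this run):

	# Prints "cgroup2fs" on a cgroup v2 host and "tmpfs" on cgroup v1.
	stat -fc %T /sys/fs/cgroup

	# The workaround minikube suggests above, as a start flag:
	out/minikube-linux-arm64 start -p functional-682596 \
		--extra-config=kubelet.cgroup-driver=systemd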
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m14.086596828s for "functional-682596" cluster.
I1217 20:44:20.436590  369461 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
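
The inspect output shows each node port published on 127.0.0.1 with an ephemeral host port; the apiserver's 8441/tcp maps to 33166. To recover a mapping without reading the full JSON (container name as above):

	# Host endpoint for the apiserver port of the node container.
	docker port functional-682596 8441/tcp    # expected: 127.0.0.1:33166

	# Or just the host port, via an inspect Go template:
	docker inspect -f '{{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}' functional-682596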
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (324.37222ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-032730 image ls --format short --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format json --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format table --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh     │ functional-032730 ssh pgrep buildkitd                                                                                                                 │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image   │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete  │ -p functional-032730                                                                                                                                  │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start   │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start   │ -p functional-682596 --alsologtostderr -v=8                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:latest                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add minikube-local-cache-test:functional-682596                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache delete minikube-local-cache-test:functional-682596                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl images                                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │                     │
	│ cache   │ functional-682596 cache reload                                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ kubectl │ functional-682596 kubectl -- --context functional-682596 get pods                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	│ start   │ -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:32:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:32:06.395598  420062 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:32:06.395704  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395708  420062 out.go:374] Setting ErrFile to fd 2...
	I1217 20:32:06.395712  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395972  420062 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:32:06.396388  420062 out.go:368] Setting JSON to false
	I1217 20:32:06.397206  420062 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11672,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:32:06.397266  420062 start.go:143] virtualization:  
	I1217 20:32:06.400889  420062 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:32:06.403953  420062 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:32:06.404019  420062 notify.go:221] Checking for updates...
	I1217 20:32:06.410244  420062 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:32:06.413231  420062 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:32:06.416152  420062 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:32:06.419145  420062 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:32:06.422186  420062 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:32:06.425355  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:06.425444  420062 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:32:06.459431  420062 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:32:06.459555  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.531840  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.520070933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.531937  420062 docker.go:319] overlay module found
	I1217 20:32:06.535075  420062 out.go:179] * Using the docker driver based on existing profile
	I1217 20:32:06.538013  420062 start.go:309] selected driver: docker
	I1217 20:32:06.538025  420062 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.538123  420062 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:32:06.538239  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.599898  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.590438982 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.600362  420062 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:32:06.600387  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:06.600439  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
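
The two cni.go lines above capture the CNI decision for this profile: no CNI was requested explicitly, and the docker driver combined with the containerd runtime resolves to kindnet. A minimal sketch of that rule, with a hypothetical chooseCNI helper rather than minikube's actual implementation:

    package main

    import "fmt"

    // chooseCNI sketches the rule logged above: an explicit choice wins;
    // otherwise docker driver + containerd runtime resolves to kindnet.
    // Hypothetical helper; the fallback value is assumed for illustration.
    func chooseCNI(driver, runtime, requested string) string {
        if requested != "" {
            return requested
        }
        if driver == "docker" && runtime == "containerd" {
            return "kindnet"
        }
        return "bridge"
    }

    func main() {
        fmt.Println(chooseCNI("docker", "containerd", "")) // kindnet
    }
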
	I1217 20:32:06.600480  420062 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.605529  420062 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:32:06.608314  420062 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:32:06.611190  420062 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:32:06.614228  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:06.614282  420062 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:32:06.614283  420062 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:32:06.614291  420062 cache.go:65] Caching tarball of preloaded images
	I1217 20:32:06.614394  420062 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:32:06.614404  420062 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
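
The preload.go lines above short-circuit the image download: the tarball name encodes the Kubernetes version, runtime, storage driver, and architecture, so a plain existence check on the cached file is enough to skip the fetch. A sketch of that check, with the path copied from the log and error handling simplified:

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        // Path copied from the log above; it varies with MINIKUBE_HOME.
        const tarball = "/home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4"
        if _, err := os.Stat(tarball); err == nil {
            fmt.Println("found local preload, skipping download")
        } else {
            fmt.Println("no local preload, would download:", err)
        }
    }
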
	I1217 20:32:06.614527  420062 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:32:06.634867  420062 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:32:06.634879  420062 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:32:06.634892  420062 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:32:06.634927  420062 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:32:06.634983  420062 start.go:364] duration metric: took 39.828µs to acquireMachinesLock for "functional-682596"
	I1217 20:32:06.635002  420062 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:32:06.635007  420062 fix.go:54] fixHost starting: 
	I1217 20:32:06.635262  420062 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:32:06.652755  420062 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:32:06.652776  420062 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:32:06.656001  420062 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:32:06.656027  420062 machine.go:94] provisionDockerMachine start ...
	I1217 20:32:06.656117  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.673371  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.673711  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.673717  420062 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:32:06.807817  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.807832  420062 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:32:06.807905  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.825970  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.826266  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.826274  420062 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:32:06.965026  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.965108  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.983394  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.983695  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.983710  420062 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
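
The shell fragment above makes the /etc/hosts edit idempotent: leave the file alone if some line already ends in the hostname, rewrite an existing 127.0.1.1 entry if there is one, and append a new entry otherwise. The same logic as a pure-Go sketch (minikube actually runs the shell form over SSH, as logged):

    package main

    import (
        "fmt"
        "os"
        "regexp"
    )

    // ensureHostname reproduces the shell above: no-op if the name is
    // present, rewrite the 127.0.1.1 line if it exists, append otherwise.
    func ensureHostname(hosts []byte, name string) []byte {
        if regexp.MustCompile(`(?m)^.*\s`+regexp.QuoteMeta(name)+`$`).Match(hosts) {
            return hosts // already present
        }
        loopback := regexp.MustCompile(`(?m)^127\.0\.1\.1\s.*$`)
        if loopback.Match(hosts) {
            return loopback.ReplaceAll(hosts, []byte("127.0.1.1 "+name))
        }
        if len(hosts) > 0 && hosts[len(hosts)-1] != '\n' {
            hosts = append(hosts, '\n')
        }
        return append(hosts, []byte("127.0.1.1 "+name+"\n")...)
    }

    func main() {
        b, err := os.ReadFile("/etc/hosts")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Print(string(ensureHostname(b, "functional-682596")))
    }
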
	I1217 20:32:07.116833  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:32:07.116850  420062 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:32:07.116869  420062 ubuntu.go:190] setting up certificates
	I1217 20:32:07.116877  420062 provision.go:84] configureAuth start
	I1217 20:32:07.116947  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.134531  420062 provision.go:143] copyHostCerts
	I1217 20:32:07.134601  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:32:07.134608  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:32:07.134696  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:32:07.134816  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:32:07.134820  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:32:07.134849  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:32:07.134907  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:32:07.134911  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:32:07.134937  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:32:07.134994  420062 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
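
provision.go then generates a server certificate whose SAN list covers every name this machine may be reached by: 127.0.0.1, 192.168.49.2, the hostname, localhost, and minikube. A self-signed crypto/x509 sketch of such a certificate; minikube signs with its CA key instead of self-signing, and the Organization string here merely mirrors the log:

    package main

    import (
        "crypto/rand"
        "crypto/rsa"
        "crypto/x509"
        "crypto/x509/pkix"
        "encoding/pem"
        "math/big"
        "net"
        "os"
        "time"
    )

    func main() {
        key, err := rsa.GenerateKey(rand.Reader, 2048)
        if err != nil {
            panic(err)
        }
        tmpl := &x509.Certificate{
            SerialNumber: big.NewInt(1),
            Subject:      pkix.Name{Organization: []string{"jenkins.functional-682596"}},
            NotBefore:    time.Now(),
            NotAfter:     time.Now().Add(26280 * time.Hour), // CertExpiration from the config above
            DNSNames:     []string{"functional-682596", "localhost", "minikube"},
            IPAddresses:  []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.49.2")},
            KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
            ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
        }
        // Self-signed for brevity: the template doubles as its own parent.
        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
        if err != nil {
            panic(err)
        }
        _ = pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: der})
    }
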
	I1217 20:32:07.402222  420062 provision.go:177] copyRemoteCerts
	I1217 20:32:07.402275  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:32:07.402313  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.421789  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.516787  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:32:07.535734  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:32:07.553569  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 20:32:07.572193  420062 provision.go:87] duration metric: took 455.301945ms to configureAuth
	I1217 20:32:07.572211  420062 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:32:07.572513  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:07.572520  420062 machine.go:97] duration metric: took 916.488302ms to provisionDockerMachine
	I1217 20:32:07.572527  420062 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:32:07.572544  420062 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:32:07.572595  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:32:07.572635  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.593078  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.688373  420062 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:32:07.691957  420062 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:32:07.691978  420062 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:32:07.691989  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:32:07.692044  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:32:07.692122  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:32:07.692197  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:32:07.692238  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:32:07.699873  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.718147  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:32:07.736089  420062 start.go:296] duration metric: took 163.546649ms for postStartSetup
	I1217 20:32:07.736163  420062 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:32:07.736210  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.753837  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.845496  420062 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:32:07.850448  420062 fix.go:56] duration metric: took 1.215434362s for fixHost
	I1217 20:32:07.850463  420062 start.go:83] releasing machines lock for "functional-682596", held for 1.215473649s
	I1217 20:32:07.850551  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.871450  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:07.871498  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:07.871505  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:07.871531  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:07.871602  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:07.871627  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:07.871680  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.871748  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:07.871798  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.889554  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.998672  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:08.024673  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:08.048014  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:08.055454  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.065155  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:08.073391  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077720  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077778  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.119356  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:08.127518  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.135465  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:08.143207  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147322  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147376  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.188376  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:08.196028  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.203401  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:08.211111  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214821  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214891  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.256072  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:08.263331  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:32:08.266724  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
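
The certificate installation above follows one pattern per CA: copy it under /usr/share/ca-certificates, symlink it into /etc/ssl/certs under its OpenSSL subject hash (<hash>.0), and finally refresh whichever trust tool the distro ships, with a command -v guard so a missing tool is not an error. A Go sketch of the same sequence (needs root, like the logged commands):

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "path/filepath"
        "strings"
    )

    func installCA(certPath string) error {
        // `openssl x509 -hash -noout` prints the subject hash, e.g. b5213941.
        out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", certPath).Output()
        if err != nil {
            return err
        }
        link := filepath.Join("/etc/ssl/certs", strings.TrimSpace(string(out))+".0")
        _ = os.Remove(link) // ln -fs semantics: replace any stale link
        if err := os.Symlink(certPath, link); err != nil {
            return err
        }
        // First tool that exists wins; neither existing is fine too.
        if _, err := exec.LookPath("update-ca-certificates"); err == nil {
            return exec.Command("update-ca-certificates").Run()
        }
        if _, err := exec.LookPath("update-ca-trust"); err == nil {
            return exec.Command("update-ca-trust", "extract").Run()
        }
        return nil
    }

    func main() {
        if err := installCA("/usr/share/ca-certificates/minikubeCA.pem"); err != nil {
            fmt.Fprintln(os.Stderr, err)
        }
    }
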
	I1217 20:32:08.270040  420062 ssh_runner.go:195] Run: cat /version.json
	I1217 20:32:08.270111  420062 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:32:08.361093  420062 ssh_runner.go:195] Run: systemctl --version
	I1217 20:32:08.367706  420062 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:32:08.372063  420062 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:32:08.372127  420062 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:32:08.380119  420062 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:32:08.380133  420062 start.go:496] detecting cgroup driver to use...
	I1217 20:32:08.380163  420062 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:32:08.380223  420062 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:32:08.395765  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:32:08.409064  420062 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:32:08.409142  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:32:08.425141  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:32:08.438808  420062 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:32:08.558555  420062 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:32:08.681937  420062 docker.go:234] disabling docker service ...
	I1217 20:32:08.681997  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:32:08.701323  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:32:08.715923  420062 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:32:08.835610  420062 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:32:08.958372  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:32:08.972822  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:32:08.987570  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:32:08.997169  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:32:09.008742  420062 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:32:09.008821  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:32:09.018997  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.028318  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:32:09.037280  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.046375  420062 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:32:09.054925  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:32:09.064191  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:32:09.073303  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:32:09.082553  420062 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:32:09.090003  420062 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:32:09.097524  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.216967  420062 ssh_runner.go:195] Run: sudo systemctl restart containerd
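
The sed runs above rewrite /etc/containerd/config.toml to match the detected cgroupfs driver: SystemdCgroup = false, runc v2 shims, the pause:3.10.1 sandbox image, and /etc/cni/net.d as the CNI conf_dir, followed by a daemon-reload and a containerd restart. One of those edits as an indentation-preserving Go sketch, equivalent to the logged sed expression:

    package main

    import (
        "os"
        "regexp"
    )

    func main() {
        const path = "/etc/containerd/config.toml"
        b, err := os.ReadFile(path)
        if err != nil {
            panic(err)
        }
        // Mirrors: sed -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g'
        re := regexp.MustCompile(`(?m)^([ \t]*)SystemdCgroup = .*$`)
        b = re.ReplaceAll(b, []byte("${1}SystemdCgroup = false"))
        if err := os.WriteFile(path, b, 0o644); err != nil {
            panic(err)
        }
        // A `systemctl daemon-reload` and `systemctl restart containerd`
        // follow, exactly as logged above.
    }
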
	I1217 20:32:09.360558  420062 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:32:09.360617  420062 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:32:09.364443  420062 start.go:564] Will wait 60s for crictl version
	I1217 20:32:09.364497  420062 ssh_runner.go:195] Run: which crictl
	I1217 20:32:09.368129  420062 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:32:09.397262  420062 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:32:09.397334  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.420778  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.446347  420062 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:32:09.449338  420062 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:32:09.466521  420062 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:32:09.473221  420062 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1217 20:32:09.476024  420062 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:32:09.476173  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:09.476285  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.523837  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.523848  420062 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:32:09.523905  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.551003  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.551014  420062 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:32:09.551021  420062 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:32:09.551143  420062 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
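
The kubelet unit text above is installed as a systemd drop-in (the 326-byte 10-kubeadm.conf scp'd a few lines below); the empty ExecStart= line first clears the base unit's command so the minikube-specific invocation fully replaces it. A sketch of writing the drop-in and reloading systemd (needs root):

    package main

    import (
        "os"
        "os/exec"
    )

    func main() {
        unit := "[Unit]\nWants=containerd.service\n\n[Service]\nExecStart=\n" +
            "ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet" +
            " --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf" +
            " --config=/var/lib/kubelet/config.yaml" +
            " --hostname-override=functional-682596" +
            " --kubeconfig=/etc/kubernetes/kubelet.conf" +
            " --node-ip=192.168.49.2\n\n[Install]\n"
        const path = "/etc/systemd/system/kubelet.service.d/10-kubeadm.conf"
        if err := os.WriteFile(path, []byte(unit), 0o644); err != nil {
            panic(err)
        }
        if err := exec.Command("systemctl", "daemon-reload").Run(); err != nil {
            panic(err)
        }
    }
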
	I1217 20:32:09.551208  420062 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:32:09.578643  420062 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1217 20:32:09.578665  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:09.578673  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:09.578683  420062 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:32:09.578707  420062 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:32:09.578827  420062 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
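
The rendered file above is one kubeadm.yaml holding four YAML documents: InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration. A stdlib-only sketch that splits the rendered file on document separators and reports each document's kind, as a pre-flight sanity check (path as scp'd below):

    package main

    import (
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    func main() {
        b, err := os.ReadFile("/var/tmp/minikube/kubeadm.yaml.new")
        if err != nil {
            panic(err)
        }
        kind := regexp.MustCompile(`(?m)^kind: (\S+)`)
        for i, doc := range strings.Split(string(b), "\n---\n") {
            if m := kind.FindStringSubmatch(doc); m != nil {
                fmt.Printf("doc %d: %s (%d bytes)\n", i, m[1], len(doc))
            } else {
                fmt.Printf("doc %d: missing kind\n", i)
            }
        }
    }
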
	
	I1217 20:32:09.578904  420062 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:32:09.586879  420062 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:32:09.586939  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:32:09.594505  420062 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:32:09.607281  420062 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:32:09.619808  420062 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1217 20:32:09.632685  420062 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:32:09.636364  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.746796  420062 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:32:10.238623  420062 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:32:10.238634  420062 certs.go:195] generating shared ca certs ...
	I1217 20:32:10.238650  420062 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:32:10.238819  420062 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:32:10.238897  420062 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:32:10.238904  420062 certs.go:257] generating profile certs ...
	I1217 20:32:10.238995  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:32:10.239044  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:32:10.239082  420062 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:32:10.239190  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:10.239221  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:10.239227  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:10.239261  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:10.239282  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:10.239304  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:10.239345  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:10.239934  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:32:10.261870  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:32:10.286466  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:32:10.307033  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:32:10.325172  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:32:10.343499  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:32:10.361814  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:32:10.379595  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:32:10.397590  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:10.415855  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:10.435021  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:10.453267  420062 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:32:10.466474  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:10.472863  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.480366  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:10.487904  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491724  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491791  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.533110  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:10.540758  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.548093  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:10.555384  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.558983  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.559039  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.602447  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:10.609962  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.617251  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:10.625102  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629186  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629244  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.670572  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:10.678295  420062 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:32:10.682347  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:32:10.723286  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:32:10.764614  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:32:10.806369  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:32:10.856829  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:32:10.900136  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
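
Each `openssl x509 ... -checkend 86400` above asks one question per certificate: will it still be valid 24 hours from now? The equivalent check via crypto/x509, as a sketch against one of the logged paths:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    // expiresWithin reports whether the cert at path expires inside d,
    // i.e. the case where `-checkend` would exit non-zero.
    func expiresWithin(path string, d time.Duration) (bool, error) {
        b, err := os.ReadFile(path)
        if err != nil {
            return false, err
        }
        block, _ := pem.Decode(b)
        if block == nil {
            return false, fmt.Errorf("%s: no PEM block", path)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            return false, err
        }
        return time.Now().Add(d).After(cert.NotAfter), nil
    }

    func main() {
        soon, err := expiresWithin("/var/lib/minikube/certs/front-proxy-client.crt", 24*time.Hour)
        if err != nil {
            panic(err)
        }
        fmt.Println("expires within 24h:", soon)
    }
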
	I1217 20:32:10.941380  420062 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:10.941458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:32:10.941532  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:10.973304  420062 cri.go:89] found id: ""
	I1217 20:32:10.973369  420062 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:32:10.981213  420062 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:32:10.981233  420062 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:32:10.981284  420062 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:32:10.989643  420062 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:10.990148  420062 kubeconfig.go:125] found "functional-682596" server: "https://192.168.49.2:8441"
	I1217 20:32:10.991404  420062 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:32:11.001770  420062 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 20:17:35.203485302 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 20:32:09.624537089 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
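
The unified diff above is the entire drift test: `diff -u old new` exits 0 when the rendered config matches the one on disk and 1 when it differs, so exit status 1 means "reconfigure the cluster from the new file". A sketch of reading that tri-state exit code in Go:

    package main

    import (
        "fmt"
        "os/exec"
    )

    // configDrifted distinguishes diff's three outcomes: 0 = identical,
    // 1 = files differ (drift), >=2 = trouble (e.g. a file is missing).
    func configDrifted(oldPath, newPath string) (bool, string, error) {
        out, err := exec.Command("diff", "-u", oldPath, newPath).CombinedOutput()
        if err == nil {
            return false, "", nil
        }
        if ee, ok := err.(*exec.ExitError); ok && ee.ExitCode() == 1 {
            return true, string(out), nil
        }
        return false, "", err
    }

    func main() {
        drift, patch, err := configDrifted("/var/tmp/minikube/kubeadm.yaml", "/var/tmp/minikube/kubeadm.yaml.new")
        if err != nil {
            panic(err)
        }
        if drift {
            fmt.Print("kubeadm config drift detected:\n" + patch)
        }
    }
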
	I1217 20:32:11.001793  420062 kubeadm.go:1161] stopping kube-system containers ...
	I1217 20:32:11.001810  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 20:32:11.001907  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:11.031815  420062 cri.go:89] found id: ""
	I1217 20:32:11.031894  420062 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 20:32:11.052689  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:32:11.061497  420062 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec 17 20:21 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 17 20:21 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 17 20:21 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 17 20:21 /etc/kubernetes/scheduler.conf
	
	I1217 20:32:11.061561  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:32:11.069861  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:32:11.077903  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.077964  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:32:11.085969  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.094098  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.094177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.102002  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:32:11.110213  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.110288  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:32:11.119148  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:32:11.127567  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:11.176595  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.173518  420062 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.996897383s)
	I1217 20:32:13.173578  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.380045  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.450955  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.494559  420062 api_server.go:52] waiting for apiserver process to appear ...
	I1217 20:32:13.494629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:13.995499  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:14.495246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... the same pgrep probe repeats every ~500ms with no match ...]
	I1217 20:33:12.995364  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
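	For reference, the probe above can be reproduced by hand; a minimal sketch, assuming a running minikube node (the <profile> placeholder stands in for the profile under test and is not taken from this log):

	  # open a shell on the node and run the same checks the harness issues over SSH
	  minikube -p <profile> ssh
	  # exact full-command-line match for the newest kube-apiserver process; exits non-zero while none exists
	  sudo pgrep -xnf 'kube-apiserver.*minikube.*'
	  # list any kube-apiserver container, running or exited; empty output means none was ever created
	  sudo crictl ps -a --quiet --name=kube-apiserver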
	I1217 20:33:13.495637  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:13.495716  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:13.520703  420062 cri.go:89] found id: ""
	I1217 20:33:13.520717  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.520724  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:13.520729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:13.520793  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:13.549658  420062 cri.go:89] found id: ""
	I1217 20:33:13.549672  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.549680  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:13.549685  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:13.549748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:13.574860  420062 cri.go:89] found id: ""
	I1217 20:33:13.574873  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.574880  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:13.574885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:13.574945  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:13.602159  420062 cri.go:89] found id: ""
	I1217 20:33:13.602173  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.602180  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:13.602185  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:13.602244  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:13.625735  420062 cri.go:89] found id: ""
	I1217 20:33:13.625748  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.625755  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:13.625760  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:13.625816  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:13.650446  420062 cri.go:89] found id: ""
	I1217 20:33:13.650460  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.650468  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:13.650473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:13.650533  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:13.677915  420062 cri.go:89] found id: ""
	I1217 20:33:13.677929  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.677936  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:13.677944  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:13.677954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:13.692434  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:13.692449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:13.767790  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:13.767810  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:13.767820  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:13.839665  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:13.839685  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:13.872573  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:13.872589  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
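	The "connection refused" on localhost:8441 in the cycle above can be cross-checked directly from inside the node; a hedged sketch using standard tools (none of these commands are taken from the harness):

	  # confirm nothing is listening on the configured apiserver port
	  sudo ss -ltnp | grep 8441 || echo "no listener on 8441"
	  # once an apiserver does come up, its liveness endpoint answers here
	  curl -k https://localhost:8441/livez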
	I1217 20:33:16.429115  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:16.438989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:16.439051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:16.466518  420062 cri.go:89] found id: ""
	I1217 20:33:16.466532  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.466539  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:16.466545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:16.466602  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:16.492200  420062 cri.go:89] found id: ""
	I1217 20:33:16.492213  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.492221  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:16.492226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:16.492302  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:16.517055  420062 cri.go:89] found id: ""
	I1217 20:33:16.517070  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.517083  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:16.517088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:16.517148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:16.552138  420062 cri.go:89] found id: ""
	I1217 20:33:16.552152  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.552159  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:16.552165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:16.552235  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:16.577184  420062 cri.go:89] found id: ""
	I1217 20:33:16.577198  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.577214  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:16.577220  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:16.577279  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:16.602039  420062 cri.go:89] found id: ""
	I1217 20:33:16.602053  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.602060  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:16.602066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:16.602124  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:16.626732  420062 cri.go:89] found id: ""
	I1217 20:33:16.626745  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.626752  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:16.626760  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:16.626770  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:16.689454  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:16.689473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:16.722345  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:16.722363  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.784686  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:16.784705  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:16.801895  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:16.801911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:16.865697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	[... the same probe-and-gather cycle (pgrep for kube-apiserver; crictl listings for kube-apiserver, etcd, coredns, kube-scheduler, kube-proxy, kube-controller-manager, and kindnet, each returning 0 containers; log gathering for kubelet, dmesg, containerd, and container status; and a failing "describe nodes" with connection refused on localhost:8441) repeats at 20:33:19, 20:33:22, 20:33:25, 20:33:28, and 20:33:31 ...]
	I1217 20:33:34.044394  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:34.054571  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:34.054632  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:34.078791  420062 cri.go:89] found id: ""
	I1217 20:33:34.078815  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.078822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:34.078827  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:34.078902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:34.103484  420062 cri.go:89] found id: ""
	I1217 20:33:34.103498  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.103505  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:34.103510  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:34.103578  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:34.128330  420062 cri.go:89] found id: ""
	I1217 20:33:34.128343  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.128362  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:34.128368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:34.128436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:34.156115  420062 cri.go:89] found id: ""
	I1217 20:33:34.156129  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.156136  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:34.156141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:34.156208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:34.179862  420062 cri.go:89] found id: ""
	I1217 20:33:34.179876  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.179884  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:34.179889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:34.179959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:34.205717  420062 cri.go:89] found id: ""
	I1217 20:33:34.205731  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.205739  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:34.205745  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:34.205804  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:34.230674  420062 cri.go:89] found id: ""
	I1217 20:33:34.230689  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.230702  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:34.230710  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:34.230720  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:34.286930  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:34.286949  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:34.301786  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:34.301803  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:34.365439  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:34.365461  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:34.365473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.426703  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:34.426724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:36.954941  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:36.964889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:36.964949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:37.000981  420062 cri.go:89] found id: ""
	I1217 20:33:37.000999  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.001008  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:37.001014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:37.001098  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:37.036987  420062 cri.go:89] found id: ""
	I1217 20:33:37.037001  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.037008  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:37.037013  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:37.037083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:37.067078  420062 cri.go:89] found id: ""
	I1217 20:33:37.067092  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.067099  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:37.067105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:37.067173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:37.101494  420062 cri.go:89] found id: ""
	I1217 20:33:37.101509  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.101516  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:37.101522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:37.101582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:37.125577  420062 cri.go:89] found id: ""
	I1217 20:33:37.125591  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.125599  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:37.125604  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:37.125672  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:37.155006  420062 cri.go:89] found id: ""
	I1217 20:33:37.155022  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.155040  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:37.155045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:37.155105  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:37.180061  420062 cri.go:89] found id: ""
	I1217 20:33:37.180075  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.180082  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:37.180090  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:37.180110  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:37.235716  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:37.235744  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:37.250676  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:37.250704  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:37.314789  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:37.314799  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:37.314811  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:37.376546  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:37.376566  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:39.904036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:39.914146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:39.914209  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:39.942353  420062 cri.go:89] found id: ""
	I1217 20:33:39.942366  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.942374  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:39.942379  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:39.942445  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:39.970090  420062 cri.go:89] found id: ""
	I1217 20:33:39.970105  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.970113  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:39.970119  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:39.970185  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:40.013204  420062 cri.go:89] found id: ""
	I1217 20:33:40.013220  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.013228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:40.013234  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:40.013312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:40.055438  420062 cri.go:89] found id: ""
	I1217 20:33:40.055453  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.055461  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:40.055467  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:40.055532  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:40.088240  420062 cri.go:89] found id: ""
	I1217 20:33:40.088285  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.088293  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:40.088298  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:40.088361  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:40.116666  420062 cri.go:89] found id: ""
	I1217 20:33:40.116680  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.116687  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:40.116693  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:40.116752  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:40.143935  420062 cri.go:89] found id: ""
	I1217 20:33:40.143951  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.143965  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:40.143973  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:40.143986  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:40.199464  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:40.199484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:40.214665  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:40.214682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:40.285603  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:40.285613  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:40.285623  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:40.348551  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:40.348571  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:42.882366  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:42.892346  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:42.892407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:42.917526  420062 cri.go:89] found id: ""
	I1217 20:33:42.917540  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.917548  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:42.917553  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:42.917622  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:42.941649  420062 cri.go:89] found id: ""
	I1217 20:33:42.941663  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.941670  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:42.941675  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:42.941737  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:42.965314  420062 cri.go:89] found id: ""
	I1217 20:33:42.965328  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.965335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:42.965341  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:42.965399  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:42.992861  420062 cri.go:89] found id: ""
	I1217 20:33:42.992875  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.992882  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:42.992888  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:42.992949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:43.026962  420062 cri.go:89] found id: ""
	I1217 20:33:43.026977  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.026984  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:43.026989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:43.027048  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:43.056268  420062 cri.go:89] found id: ""
	I1217 20:33:43.056282  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.056289  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:43.056295  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:43.056353  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:43.088527  420062 cri.go:89] found id: ""
	I1217 20:33:43.088542  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.088549  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:43.088556  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:43.088567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:43.115028  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:43.115044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:43.170239  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:43.170258  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:43.185453  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:43.185468  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:43.255155  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:43.255166  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:43.255176  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:45.818750  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:45.829020  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:45.829084  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:45.854296  420062 cri.go:89] found id: ""
	I1217 20:33:45.854310  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.854319  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:45.854327  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:45.854393  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:45.884706  420062 cri.go:89] found id: ""
	I1217 20:33:45.884720  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.884728  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:45.884733  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:45.884795  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:45.909518  420062 cri.go:89] found id: ""
	I1217 20:33:45.909533  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.909540  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:45.909545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:45.909615  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:45.935050  420062 cri.go:89] found id: ""
	I1217 20:33:45.935065  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.935073  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:45.935078  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:45.935155  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:45.964622  420062 cri.go:89] found id: ""
	I1217 20:33:45.964636  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.964643  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:45.964648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:45.964714  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:45.992340  420062 cri.go:89] found id: ""
	I1217 20:33:45.992355  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.992363  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:45.992368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:45.992432  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:46.029800  420062 cri.go:89] found id: ""
	I1217 20:33:46.029815  420062 logs.go:282] 0 containers: []
	W1217 20:33:46.029822  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:46.029841  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:46.029852  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:46.096203  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:46.096224  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:46.111499  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:46.111517  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:46.174259  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:46.174269  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:46.174282  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:46.239891  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:46.239911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:48.769726  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:48.779731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:48.779796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:48.803697  420062 cri.go:89] found id: ""
	I1217 20:33:48.803710  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.803718  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:48.803723  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:48.803790  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:48.828947  420062 cri.go:89] found id: ""
	I1217 20:33:48.828966  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.828974  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:48.828979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:48.829045  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:48.853794  420062 cri.go:89] found id: ""
	I1217 20:33:48.853809  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.853815  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:48.853821  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:48.853884  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:48.879220  420062 cri.go:89] found id: ""
	I1217 20:33:48.879234  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.879241  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:48.879253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:48.879316  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:48.905546  420062 cri.go:89] found id: ""
	I1217 20:33:48.905560  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.905567  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:48.905573  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:48.905639  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:48.931025  420062 cri.go:89] found id: ""
	I1217 20:33:48.931040  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.931047  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:48.931053  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:48.931111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:48.959554  420062 cri.go:89] found id: ""
	I1217 20:33:48.959567  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.959575  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:48.959591  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:48.959603  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:49.037548  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:49.037558  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:49.037576  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:49.104606  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:49.104628  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:49.132120  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:49.132142  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:49.189781  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:49.189799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:51.705313  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:51.715310  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:51.715375  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:51.742788  420062 cri.go:89] found id: ""
	I1217 20:33:51.742803  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.742810  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:51.742816  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:51.742878  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:51.768132  420062 cri.go:89] found id: ""
	I1217 20:33:51.768147  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.768154  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:51.768160  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:51.768220  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:51.796803  420062 cri.go:89] found id: ""
	I1217 20:33:51.796817  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.796825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:51.796831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:51.796891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:51.823032  420062 cri.go:89] found id: ""
	I1217 20:33:51.823046  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.823054  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:51.823061  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:51.823122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:51.848750  420062 cri.go:89] found id: ""
	I1217 20:33:51.848765  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.848773  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:51.848778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:51.848840  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:51.874494  420062 cri.go:89] found id: ""
	I1217 20:33:51.874509  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.874516  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:51.874522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:51.874582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:51.912240  420062 cri.go:89] found id: ""
	I1217 20:33:51.912273  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.912281  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:51.912290  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:51.912301  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:51.940881  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:51.940897  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:51.997574  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:51.997596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:52.016000  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:52.016018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:52.093264  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:52.093274  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:52.093286  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:54.657449  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:54.667679  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:54.667741  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:54.696106  420062 cri.go:89] found id: ""
	I1217 20:33:54.696121  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.696128  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:54.696133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:54.696194  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:54.720578  420062 cri.go:89] found id: ""
	I1217 20:33:54.720592  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.720599  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:54.720605  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:54.720669  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:54.746036  420062 cri.go:89] found id: ""
	I1217 20:33:54.746050  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.746058  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:54.746063  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:54.746122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:54.770192  420062 cri.go:89] found id: ""
	I1217 20:33:54.770206  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.770213  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:54.770219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:54.770275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:54.794365  420062 cri.go:89] found id: ""
	I1217 20:33:54.794379  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.794386  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:54.794391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:54.794454  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:54.818424  420062 cri.go:89] found id: ""
	I1217 20:33:54.818438  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.818446  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:54.818451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:54.818513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:54.843360  420062 cri.go:89] found id: ""
	I1217 20:33:54.843375  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.843382  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:54.843401  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:54.843412  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:54.872684  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:54.872701  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:54.928831  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:54.928851  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:54.943545  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:54.943561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:55.020697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:55.020721  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:55.020734  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.590507  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:57.600840  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:57.600911  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:57.628650  420062 cri.go:89] found id: ""
	I1217 20:33:57.628664  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.628671  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:57.628676  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:57.628736  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:57.653915  420062 cri.go:89] found id: ""
	I1217 20:33:57.653929  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.653936  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:57.653941  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:57.654005  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:57.677881  420062 cri.go:89] found id: ""
	I1217 20:33:57.677894  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.677901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:57.677906  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:57.677974  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:57.701808  420062 cri.go:89] found id: ""
	I1217 20:33:57.701823  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.701830  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:57.701836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:57.701894  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:57.725682  420062 cri.go:89] found id: ""
	I1217 20:33:57.725696  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.725703  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:57.725708  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:57.725770  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:57.753864  420062 cri.go:89] found id: ""
	I1217 20:33:57.753878  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.753885  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:57.753891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:57.753948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:57.779180  420062 cri.go:89] found id: ""
	I1217 20:33:57.779193  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.779200  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:57.779216  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:57.779227  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:57.834554  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:57.834575  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:57.849468  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:57.849484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:57.917796  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:57.917816  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:57.917827  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.980535  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:57.980556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
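The repeated "connection refused" on [::1]:8441 means nothing is listening on the configured --apiserver-port inside the node. A quick probe of the port, assuming ss and curl are available in the node image (a sketch, not part of the test's own tooling):

    # check whether anything listens on the apiserver port (hedged sketch)
    minikube -p functional-682596 ssh -- "sudo ss -ltnp | grep 8441 || echo 'nothing listening on 8441'"
    minikube -p functional-682596 ssh -- curl -sk https://localhost:8441/healthz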
	I1217 20:34:00.519246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:00.531028  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:00.531090  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:00.557919  420062 cri.go:89] found id: ""
	I1217 20:34:00.557933  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.557941  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:00.557947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:00.558006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:00.583357  420062 cri.go:89] found id: ""
	I1217 20:34:00.583381  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.583389  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:00.583394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:00.583461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:00.608300  420062 cri.go:89] found id: ""
	I1217 20:34:00.608313  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.608321  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:00.608326  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:00.608396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:00.633249  420062 cri.go:89] found id: ""
	I1217 20:34:00.633263  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.633271  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:00.633277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:00.633354  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:00.657998  420062 cri.go:89] found id: ""
	I1217 20:34:00.658012  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.658020  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:00.658025  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:00.658083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:00.686479  420062 cri.go:89] found id: ""
	I1217 20:34:00.686494  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.686502  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:00.686517  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:00.686600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:00.715237  420062 cri.go:89] found id: ""
	I1217 20:34:00.715251  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.715259  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:00.715281  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:00.715297  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:00.771736  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:00.771756  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:00.786569  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:00.786584  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:00.855532  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:00.846820   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.847617   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849290   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849821   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.851435   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:00.855544  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:00.855556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:00.929889  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:00.929917  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.457778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:03.467767  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:03.467830  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:03.491745  420062 cri.go:89] found id: ""
	I1217 20:34:03.491760  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.491767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:03.491772  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:03.491834  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:03.516486  420062 cri.go:89] found id: ""
	I1217 20:34:03.516501  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.516508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:03.516514  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:03.516573  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:03.545504  420062 cri.go:89] found id: ""
	I1217 20:34:03.545518  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.545526  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:03.545531  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:03.545592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:03.570752  420062 cri.go:89] found id: ""
	I1217 20:34:03.570766  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.570773  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:03.570779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:03.570837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:03.599464  420062 cri.go:89] found id: ""
	I1217 20:34:03.599478  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.599486  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:03.599491  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:03.599551  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:03.626193  420062 cri.go:89] found id: ""
	I1217 20:34:03.626209  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.626217  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:03.626222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:03.626280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:03.650682  420062 cri.go:89] found id: ""
	I1217 20:34:03.650696  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.650704  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:03.650712  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:03.650724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:03.712614  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:03.705244   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.705869   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.706805   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.707331   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.708827   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:03.712625  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:03.712636  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:03.775226  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:03.775247  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.801581  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:03.801600  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:03.857991  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:03.858013  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
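The "sudo pgrep -xnf kube-apiserver.*minikube.*" probe is what gates each retry: the loop only moves on once a process whose full command line matches that pattern exists. An equivalent wait loop, run inside the node, might look like this (a sketch; the 60 x 3 s budget is an assumption, not the test's actual timeout):

    # poll for the apiserver process the way the retries above do (hedged sketch)
    for i in $(seq 1 60); do
      sudo pgrep -xnf 'kube-apiserver.*minikube.*' && break
      sleep 3
    done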
	I1217 20:34:06.373018  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:06.382912  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:06.382972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:06.408596  420062 cri.go:89] found id: ""
	I1217 20:34:06.408610  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.408617  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:06.408622  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:06.408681  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:06.437062  420062 cri.go:89] found id: ""
	I1217 20:34:06.437076  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.437083  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:06.437088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:06.437149  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:06.463109  420062 cri.go:89] found id: ""
	I1217 20:34:06.463123  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.463130  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:06.463135  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:06.463198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:06.487450  420062 cri.go:89] found id: ""
	I1217 20:34:06.487463  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.487470  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:06.487476  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:06.487537  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:06.512848  420062 cri.go:89] found id: ""
	I1217 20:34:06.512863  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.512870  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:06.512876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:06.512939  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:06.536984  420062 cri.go:89] found id: ""
	I1217 20:34:06.536998  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.537006  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:06.537011  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:06.537069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:06.565689  420062 cri.go:89] found id: ""
	I1217 20:34:06.565732  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.565740  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:06.565748  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:06.565758  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:06.626274  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:06.626294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.641612  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:06.641630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:06.703082  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:06.694717   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.695365   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697091   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697739   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.699357   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:06.703092  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:06.703104  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:06.768202  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:06.768221  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:09.296397  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:09.306558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:09.306619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:09.330814  420062 cri.go:89] found id: ""
	I1217 20:34:09.330828  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.330836  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:09.330841  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:09.330900  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:09.360228  420062 cri.go:89] found id: ""
	I1217 20:34:09.360242  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.360270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:09.360276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:09.360336  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:09.383852  420062 cri.go:89] found id: ""
	I1217 20:34:09.383865  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.383871  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:09.383876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:09.383933  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:09.408740  420062 cri.go:89] found id: ""
	I1217 20:34:09.408753  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.408760  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:09.408765  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:09.408824  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:09.433879  420062 cri.go:89] found id: ""
	I1217 20:34:09.433894  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.433901  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:09.433907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:09.433965  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:09.458138  420062 cri.go:89] found id: ""
	I1217 20:34:09.458152  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.458160  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:09.458165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:09.458223  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:09.482170  420062 cri.go:89] found id: ""
	I1217 20:34:09.482184  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.482191  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:09.482199  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:09.482214  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:09.539809  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:09.539831  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:09.555108  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:09.555124  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:09.617755  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:09.617779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:09.617790  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:09.680900  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:09.680920  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:12.217262  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:12.227378  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:12.227441  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:12.260904  420062 cri.go:89] found id: ""
	I1217 20:34:12.260918  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.260926  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:12.260931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:12.260991  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:12.290600  420062 cri.go:89] found id: ""
	I1217 20:34:12.290614  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.290621  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:12.290626  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:12.290694  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:12.317694  420062 cri.go:89] found id: ""
	I1217 20:34:12.317708  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.317716  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:12.317721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:12.317789  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:12.347280  420062 cri.go:89] found id: ""
	I1217 20:34:12.347300  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.347308  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:12.347323  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:12.347382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:12.375032  420062 cri.go:89] found id: ""
	I1217 20:34:12.375046  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.375054  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:12.375060  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:12.375121  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:12.400749  420062 cri.go:89] found id: ""
	I1217 20:34:12.400763  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.400771  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:12.400779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:12.400837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:12.425915  420062 cri.go:89] found id: ""
	I1217 20:34:12.425929  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.425937  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:12.425946  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:12.425957  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:12.486250  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:12.486269  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:12.501500  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:12.501515  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:12.571896  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:12.563218   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564136   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564679   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566300   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566801   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:12.571906  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:12.571921  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:12.635853  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:12.635876  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.166604  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:15.177581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:15.177645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:15.201800  420062 cri.go:89] found id: ""
	I1217 20:34:15.201815  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.201822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:15.201828  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:15.201892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:15.229609  420062 cri.go:89] found id: ""
	I1217 20:34:15.229624  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.229631  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:15.229636  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:15.229703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:15.257583  420062 cri.go:89] found id: ""
	I1217 20:34:15.257597  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.257605  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:15.257610  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:15.257673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:15.291085  420062 cri.go:89] found id: ""
	I1217 20:34:15.291099  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.291106  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:15.291112  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:15.291190  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:15.324198  420062 cri.go:89] found id: ""
	I1217 20:34:15.324212  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.324219  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:15.324226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:15.324317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:15.348977  420062 cri.go:89] found id: ""
	I1217 20:34:15.348991  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.348998  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:15.349004  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:15.349069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:15.373132  420062 cri.go:89] found id: ""
	I1217 20:34:15.373147  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.373155  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:15.373162  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:15.373174  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:15.387711  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:15.387728  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:15.453164  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:15.443181   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.443915   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.445657   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.447470   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.448047   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:15.453175  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:15.453187  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:15.519197  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:15.519219  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.547781  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:15.547799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
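crictl returns no IDs for any control-plane name, so either kubelet never created the static pods or containerd holds no k8s.io containers at all. To distinguish the two, list everything instead of filtering by name (a sketch; it assumes ctr is present in the node image, as it normally ships alongside containerd):

    # look for any k8s containers/pods, not just named ones (hedged sketch)
    minikube -p functional-682596 ssh -- sudo crictl ps -a
    minikube -p functional-682596 ssh -- sudo crictl pods
    minikube -p functional-682596 ssh -- sudo ctr -n k8s.io containers list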
	I1217 20:34:18.106475  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:18.117557  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:18.117619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:18.142233  420062 cri.go:89] found id: ""
	I1217 20:34:18.142246  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.142253  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:18.142258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:18.142319  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:18.166913  420062 cri.go:89] found id: ""
	I1217 20:34:18.166927  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.166934  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:18.166940  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:18.167002  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:18.195856  420062 cri.go:89] found id: ""
	I1217 20:34:18.195870  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.195877  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:18.195883  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:18.195944  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:18.222291  420062 cri.go:89] found id: ""
	I1217 20:34:18.222306  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.222313  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:18.222318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:18.222382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:18.254911  420062 cri.go:89] found id: ""
	I1217 20:34:18.254925  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.254932  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:18.254937  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:18.254996  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:18.299082  420062 cri.go:89] found id: ""
	I1217 20:34:18.299096  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.299103  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:18.299109  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:18.299173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:18.323848  420062 cri.go:89] found id: ""
	I1217 20:34:18.323862  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.323869  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:18.323877  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:18.323888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.381056  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:18.381082  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:18.395602  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:18.395617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:18.459223  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:18.450909   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.451543   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453107   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453711   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.455276   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:18.459233  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:18.459244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:18.522287  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:18.522307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
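With every retry failing identically, the root cause is more likely to surface in the kubelet journal than in these truncated 400-line tails. Pulling the full journal uses the same journalctl interface the gatherer already relies on (hedged sketch):

    # full kubelet journal instead of the 400-line tail (hedged sketch)
    minikube -p functional-682596 ssh -- sudo journalctl -u kubelet --no-pager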
	I1217 20:34:21.051832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:21.062206  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:21.062275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:21.090124  420062 cri.go:89] found id: ""
	I1217 20:34:21.090139  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.090146  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:21.090151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:21.090211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:21.114268  420062 cri.go:89] found id: ""
	I1217 20:34:21.114282  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.114289  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:21.114294  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:21.114357  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:21.141585  420062 cri.go:89] found id: ""
	I1217 20:34:21.141599  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.141606  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:21.141611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:21.141673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:21.167173  420062 cri.go:89] found id: ""
	I1217 20:34:21.167187  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.167195  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:21.167200  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:21.167277  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:21.191543  420062 cri.go:89] found id: ""
	I1217 20:34:21.191557  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.191564  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:21.191569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:21.191640  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:21.219365  420062 cri.go:89] found id: ""
	I1217 20:34:21.219378  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.219385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:21.219390  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:21.219451  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:21.256303  420062 cri.go:89] found id: ""
	I1217 20:34:21.256317  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.256324  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:21.256332  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:21.256342  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:21.323014  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:21.323035  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:21.337647  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:21.337664  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:21.400131  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:21.391524   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.392409   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394006   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394305   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.395921   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:21.400140  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:21.400151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:21.467704  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:21.467725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
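
Every describe-nodes attempt in this section fails identically: kubectl cannot reach https://localhost:8441, and the dial target [::1]:8441 shows localhost resolving to IPv6 first inside the node. A quick way to separate "apiserver never came up" from "wrong port" is to probe the port directly; this sketch assumes curl and ss are present in the node image (neither appears in the log):

    # Is anything answering on the apiserver port? -k skips TLS
    # verification; /livez is the standard apiserver liveness endpoint.
    curl -ks https://localhost:8441/livez || echo "nothing answering on 8441"

    # Show listeners on 8441 for both address families.
    sudo ss -ltnp | grep -w 8441 || echo "no listener on :8441"
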
	I1217 20:34:23.996278  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:24.008421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:24.008487  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:24.035322  420062 cri.go:89] found id: ""
	I1217 20:34:24.035336  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.035344  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:24.035349  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:24.035413  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:24.060026  420062 cri.go:89] found id: ""
	I1217 20:34:24.060040  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.060048  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:24.060054  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:24.060131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:24.085236  420062 cri.go:89] found id: ""
	I1217 20:34:24.085250  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.085257  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:24.085263  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:24.085323  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:24.110730  420062 cri.go:89] found id: ""
	I1217 20:34:24.110763  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.110772  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:24.110778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:24.110851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:24.138006  420062 cri.go:89] found id: ""
	I1217 20:34:24.138020  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.138028  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:24.138034  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:24.138094  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:24.168065  420062 cri.go:89] found id: ""
	I1217 20:34:24.168080  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.168094  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:24.168100  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:24.168172  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:24.193244  420062 cri.go:89] found id: ""
	I1217 20:34:24.193258  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.193265  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:24.193273  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:24.193284  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:24.260181  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:24.260201  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:24.299429  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:24.299446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:24.355633  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:24.355653  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:24.371493  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:24.371508  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:24.439767  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
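
Each retry cycle opens with the same pgrep probe before falling back to crictl. Its flags are worth unpacking; the sketch below reproduces the check with the pattern quoted (a small assumption: the log prints it unquoted):

    # -f  match against the full command line, not just the process name
    # -x  require the pattern to match that command line exactly
    # -n  report only the newest matching process
    sudo pgrep -xnf 'kube-apiserver.*minikube.*' \
      && echo "apiserver process found" \
      || echo "no apiserver process"
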
	I1217 20:34:26.940651  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:26.951081  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:26.951148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:26.975583  420062 cri.go:89] found id: ""
	I1217 20:34:26.975598  420062 logs.go:282] 0 containers: []
	W1217 20:34:26.975606  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:26.975611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:26.975671  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:27.003924  420062 cri.go:89] found id: ""
	I1217 20:34:27.003939  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.003948  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:27.003954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:27.004018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:27.029433  420062 cri.go:89] found id: ""
	I1217 20:34:27.029446  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.029454  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:27.029460  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:27.029520  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:27.055977  420062 cri.go:89] found id: ""
	I1217 20:34:27.055990  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.055998  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:27.056027  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:27.056093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:27.081756  420062 cri.go:89] found id: ""
	I1217 20:34:27.081770  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.081777  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:27.081783  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:27.081846  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:27.106532  420062 cri.go:89] found id: ""
	I1217 20:34:27.106546  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.106554  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:27.106587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:27.106651  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:27.131573  420062 cri.go:89] found id: ""
	I1217 20:34:27.131587  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.131595  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:27.131603  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:27.131613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:27.194270  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:27.194290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:27.222438  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:27.222453  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:27.284134  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:27.284154  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:27.300336  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:27.300352  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:27.369337  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
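
When nothing is found, the cycle ends by gathering the same bundle: kubelet and containerd journals, filtered dmesg, describe nodes, and container status. The bundle can be re-collected by hand with the log's own commands; only the redirection to files is added here for convenience:

    # Collect the same diagnostics minikube gathers, one file per source.
    sudo journalctl -u kubelet -n 400    > kubelet.log
    sudo journalctl -u containerd -n 400 > containerd.log
    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400 > dmesg.log
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes \
      --kubeconfig=/var/lib/minikube/kubeconfig > nodes.log 2>&1
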
	I1217 20:34:29.871004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:29.881325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:29.881389  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:29.906739  420062 cri.go:89] found id: ""
	I1217 20:34:29.906753  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.906760  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:29.906766  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:29.906828  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:29.935023  420062 cri.go:89] found id: ""
	I1217 20:34:29.935037  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.935045  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:29.935049  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:29.935110  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:29.968427  420062 cri.go:89] found id: ""
	I1217 20:34:29.968442  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.968449  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:29.968454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:29.968514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:29.993120  420062 cri.go:89] found id: ""
	I1217 20:34:29.993133  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.993141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:29.993147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:29.993208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:30.038216  420062 cri.go:89] found id: ""
	I1217 20:34:30.038232  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.038240  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:30.038256  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:30.038331  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:30.088044  420062 cri.go:89] found id: ""
	I1217 20:34:30.088059  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.088067  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:30.088080  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:30.088145  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:30.116773  420062 cri.go:89] found id: ""
	I1217 20:34:30.116789  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.116798  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:30.116808  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:30.116819  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:30.175618  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:30.175638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:30.191950  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:30.191967  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:30.268938  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:30.268949  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:30.268960  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:30.345609  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:30.345631  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:32.873852  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:32.884009  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:32.884072  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:32.908673  420062 cri.go:89] found id: ""
	I1217 20:34:32.908688  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.908696  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:32.908701  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:32.908761  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:32.933101  420062 cri.go:89] found id: ""
	I1217 20:34:32.933115  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.933122  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:32.933127  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:32.933192  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:32.956968  420062 cri.go:89] found id: ""
	I1217 20:34:32.956982  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.956991  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:32.956996  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:32.957054  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:32.982228  420062 cri.go:89] found id: ""
	I1217 20:34:32.982241  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.982249  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:32.982254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:32.982312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:33.011791  420062 cri.go:89] found id: ""
	I1217 20:34:33.011805  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.011812  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:33.011818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:33.011885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:33.038878  420062 cri.go:89] found id: ""
	I1217 20:34:33.038894  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.038901  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:33.038907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:33.038969  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:33.068421  420062 cri.go:89] found id: ""
	I1217 20:34:33.068436  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.068443  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:33.068453  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:33.068463  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:33.083444  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:33.083461  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:33.147593  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:33.147604  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:33.147617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:33.211005  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:33.211025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:33.247311  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:33.247327  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
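
Stepping back, the timestamps show one polling loop: roughly every three seconds the checks repeat and the log bundle is re-gathered. A hedged sketch of an equivalent wait loop (illustrative only, not minikube's implementation; the 3 s cadence is read off the timestamps, and the 300 s budget is invented for the example):

    # Poll until an apiserver process appears or a deadline passes.
    deadline=$((SECONDS + 300))
    until sudo pgrep -xnf 'kube-apiserver.*minikube.*' > /dev/null; do
      if (( SECONDS >= deadline )); then
        echo "timed out waiting for kube-apiserver" >&2
        exit 1
      fi
      sleep 3
    done
    echo "kube-apiserver is running"
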
	I1217 20:34:35.820692  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:35.830805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:35.830879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:35.855694  420062 cri.go:89] found id: ""
	I1217 20:34:35.855708  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.855716  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:35.855721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:35.855780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:35.879932  420062 cri.go:89] found id: ""
	I1217 20:34:35.879947  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.879955  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:35.879960  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:35.880021  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:35.904606  420062 cri.go:89] found id: ""
	I1217 20:34:35.904622  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.904630  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:35.904635  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:35.904700  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:35.932655  420062 cri.go:89] found id: ""
	I1217 20:34:35.932669  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.932676  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:35.932681  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:35.932742  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:35.956665  420062 cri.go:89] found id: ""
	I1217 20:34:35.956679  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.956686  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:35.956691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:35.956748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:35.981363  420062 cri.go:89] found id: ""
	I1217 20:34:35.981377  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.981385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:35.981391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:35.981450  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:36.013052  420062 cri.go:89] found id: ""
	I1217 20:34:36.013068  420062 logs.go:282] 0 containers: []
	W1217 20:34:36.013076  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:36.013084  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:36.013097  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:36.080346  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:36.080367  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:36.109280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:36.109296  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:36.168612  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:36.168630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:36.183490  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:36.183505  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:36.254206  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:38.754461  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:38.764820  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:38.764885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:38.790226  420062 cri.go:89] found id: ""
	I1217 20:34:38.790243  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.790251  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:38.790257  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:38.790317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:38.815898  420062 cri.go:89] found id: ""
	I1217 20:34:38.815913  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.815920  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:38.815925  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:38.815986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:38.840879  420062 cri.go:89] found id: ""
	I1217 20:34:38.840894  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.840901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:38.840907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:38.840967  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:38.865756  420062 cri.go:89] found id: ""
	I1217 20:34:38.865772  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.865780  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:38.865785  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:38.865851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:38.893497  420062 cri.go:89] found id: ""
	I1217 20:34:38.893511  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.893518  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:38.893523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:38.893582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:38.918737  420062 cri.go:89] found id: ""
	I1217 20:34:38.918751  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.918758  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:38.918763  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:38.918821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:38.943126  420062 cri.go:89] found id: ""
	I1217 20:34:38.943140  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.943147  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:38.943155  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:38.943166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:39.008933  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:39.008944  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:39.008955  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:39.071529  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:39.071550  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:39.098851  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:39.098866  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:39.157559  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:39.157578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
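
The per-component checks each call crictl with a name filter; --quiet prints bare container IDs, and an empty result is exactly what the repeated `found id: ""` lines record. The whole sweep fits in one loop (component names taken verbatim from the log):

    # List container IDs for each control-plane component, any state.
    for name in kube-apiserver etcd coredns kube-scheduler \
                kube-proxy kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$name")
      echo "$name: ${ids:-<none>}"
    done
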
	I1217 20:34:41.673292  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:41.683569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:41.683631  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:41.712444  420062 cri.go:89] found id: ""
	I1217 20:34:41.712458  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.712466  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:41.712471  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:41.712540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:41.737230  420062 cri.go:89] found id: ""
	I1217 20:34:41.737244  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.737253  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:41.737258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:41.737320  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:41.765904  420062 cri.go:89] found id: ""
	I1217 20:34:41.765918  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.765926  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:41.765931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:41.765993  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:41.790803  420062 cri.go:89] found id: ""
	I1217 20:34:41.790818  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.790826  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:41.790831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:41.790891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:41.816378  420062 cri.go:89] found id: ""
	I1217 20:34:41.816393  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.816399  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:41.816405  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:41.816465  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:41.846163  420062 cri.go:89] found id: ""
	I1217 20:34:41.846177  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.846184  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:41.846190  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:41.846249  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:41.874235  420062 cri.go:89] found id: ""
	I1217 20:34:41.874249  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.874257  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:41.874264  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:41.874278  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:41.930007  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:41.930025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.944733  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:41.944748  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:42.015145  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:42.015157  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:42.015168  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:42.083018  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:42.083046  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.617783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:44.627898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:44.627959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:44.654510  420062 cri.go:89] found id: ""
	I1217 20:34:44.654524  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.654531  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:44.654536  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:44.654600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:44.681532  420062 cri.go:89] found id: ""
	I1217 20:34:44.681547  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.681554  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:44.681560  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:44.681620  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:44.705927  420062 cri.go:89] found id: ""
	I1217 20:34:44.705941  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.705948  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:44.705953  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:44.706010  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:44.730835  420062 cri.go:89] found id: ""
	I1217 20:34:44.730849  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.730857  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:44.730862  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:44.730925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:44.754987  420062 cri.go:89] found id: ""
	I1217 20:34:44.755002  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.755009  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:44.755014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:44.755074  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:44.778787  420062 cri.go:89] found id: ""
	I1217 20:34:44.778801  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.778808  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:44.778814  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:44.778874  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:44.804370  420062 cri.go:89] found id: ""
	I1217 20:34:44.804385  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.804392  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:44.804401  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:44.804411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:44.870852  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:44.870872  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.901529  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:44.901545  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:44.961405  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:44.961428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:44.976411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:44.976427  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:45.055180  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.555437  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:47.565320  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:47.565380  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:47.594473  420062 cri.go:89] found id: ""
	I1217 20:34:47.594488  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.594495  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:47.594500  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:47.594560  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:47.618819  420062 cri.go:89] found id: ""
	I1217 20:34:47.618833  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.618840  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:47.618845  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:47.618906  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:47.643299  420062 cri.go:89] found id: ""
	I1217 20:34:47.643313  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.643320  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:47.643325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:47.643386  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:47.668500  420062 cri.go:89] found id: ""
	I1217 20:34:47.668514  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.668522  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:47.668527  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:47.668588  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:47.694650  420062 cri.go:89] found id: ""
	I1217 20:34:47.694671  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.694678  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:47.694683  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:47.694745  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:47.729169  420062 cri.go:89] found id: ""
	I1217 20:34:47.729183  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.729192  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:47.729197  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:47.729258  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:47.753481  420062 cri.go:89] found id: ""
	I1217 20:34:47.753494  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.753501  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:47.753509  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:47.753521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:47.768175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:47.768192  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:47.832224  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.832234  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:47.832264  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:47.894275  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:47.894294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:47.921621  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:47.921638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
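From here on the log is one fixed diagnostic cycle on a short backoff: pgrep for a live kube-apiserver process, a crictl sweep over each expected control-plane container, then the same four gathers (kubelet, dmesg, describe nodes, containerd/container status) in varying order. A compact sketch of that sweep, with the component list taken from the cri.go probes above and assuming nothing beyond stock crictl on the node:

    # Sketch of the per-cycle container sweep recorded in this log.
    # Component names mirror the probes above; plain crictl assumed on PATH.
    for c in kube-apiserver etcd coredns kube-scheduler kube-proxy \
             kube-controller-manager kindnet; do
      sudo crictl ps -a --quiet --name="$c"   # empty output = not found
    done

Every sweep in this run returns empty for all seven names, so each cycle ends at the same failing describe-nodes call.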
	I1217 20:34:50.477347  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:50.487837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:50.487905  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:50.515440  420062 cri.go:89] found id: ""
	I1217 20:34:50.515460  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.515468  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:50.515473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:50.515545  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:50.542521  420062 cri.go:89] found id: ""
	I1217 20:34:50.542546  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.542553  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:50.542559  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:50.542629  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:50.569586  420062 cri.go:89] found id: ""
	I1217 20:34:50.569600  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.569613  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:50.569618  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:50.569677  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:50.597938  420062 cri.go:89] found id: ""
	I1217 20:34:50.597951  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.597958  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:50.597966  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:50.598024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:50.627019  420062 cri.go:89] found id: ""
	I1217 20:34:50.627044  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.627052  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:50.627057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:50.627128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:50.655921  420062 cri.go:89] found id: ""
	I1217 20:34:50.655948  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.655956  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:50.655962  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:50.656028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:50.680457  420062 cri.go:89] found id: ""
	I1217 20:34:50.680471  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.680479  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:50.680487  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:50.680502  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:50.742350  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:50.742360  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:50.742370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:50.802977  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:50.802997  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:50.830354  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:50.830370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.887850  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:50.887869  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.403065  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:53.413162  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:53.413227  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:53.437500  420062 cri.go:89] found id: ""
	I1217 20:34:53.437513  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.437521  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:53.437526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:53.437592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:53.462889  420062 cri.go:89] found id: ""
	I1217 20:34:53.462902  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.462910  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:53.462915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:53.462972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:53.493212  420062 cri.go:89] found id: ""
	I1217 20:34:53.493226  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.493234  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:53.493239  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:53.493301  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:53.521829  420062 cri.go:89] found id: ""
	I1217 20:34:53.521844  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.521851  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:53.521857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:53.521919  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:53.558427  420062 cri.go:89] found id: ""
	I1217 20:34:53.558442  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.558449  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:53.558454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:53.558513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:53.583439  420062 cri.go:89] found id: ""
	I1217 20:34:53.583453  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.583460  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:53.583466  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:53.583526  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:53.608693  420062 cri.go:89] found id: ""
	I1217 20:34:53.608707  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.608714  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:53.608722  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:53.608732  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:53.664959  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:53.664980  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.679865  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:53.679886  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:53.742568  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:53.742579  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:53.742591  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:53.803297  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:53.803317  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.335304  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:56.344915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:56.344977  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:56.368289  420062 cri.go:89] found id: ""
	I1217 20:34:56.368304  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.368312  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:56.368319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:56.368388  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:56.392693  420062 cri.go:89] found id: ""
	I1217 20:34:56.392707  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.392715  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:56.392721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:56.392782  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:56.419795  420062 cri.go:89] found id: ""
	I1217 20:34:56.419809  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.419825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:56.419834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:56.419902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:56.445038  420062 cri.go:89] found id: ""
	I1217 20:34:56.445052  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.445060  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:56.445065  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:56.445128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:56.474272  420062 cri.go:89] found id: ""
	I1217 20:34:56.474287  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.474294  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:56.474300  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:56.474366  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:56.507935  420062 cri.go:89] found id: ""
	I1217 20:34:56.507950  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.507957  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:56.507963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:56.508030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:56.535999  420062 cri.go:89] found id: ""
	I1217 20:34:56.536012  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.536030  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:56.536039  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:56.536050  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.572020  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:56.572037  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:56.628661  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:56.628681  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:56.643833  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:56.643856  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:56.710351  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:56.710361  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:56.710380  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.273579  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:59.283581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:59.283645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:59.309480  420062 cri.go:89] found id: ""
	I1217 20:34:59.309493  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.309500  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:59.309506  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:59.309564  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:59.333365  420062 cri.go:89] found id: ""
	I1217 20:34:59.333378  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.333386  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:59.333391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:59.333452  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:59.357207  420062 cri.go:89] found id: ""
	I1217 20:34:59.357221  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.357228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:59.357233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:59.357298  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:59.381758  420062 cri.go:89] found id: ""
	I1217 20:34:59.381772  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.381781  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:59.381787  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:59.381845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:59.406750  420062 cri.go:89] found id: ""
	I1217 20:34:59.406764  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.406772  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:59.406777  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:59.406845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:59.431825  420062 cri.go:89] found id: ""
	I1217 20:34:59.431838  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.431846  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:59.431852  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:59.431913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:59.458993  420062 cri.go:89] found id: ""
	I1217 20:34:59.459007  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.459014  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:59.459022  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:59.459041  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:59.546381  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:59.546391  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:59.546401  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.613987  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:59.614007  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:59.644296  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:59.644311  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:59.703226  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:59.703245  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
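Because the sweeps never find even an exited control-plane container, the kubelet journal collected each cycle is the most promising place to see why the static pods were never created. A hedged filter over the same journal slice minikube gathers above, assuming GNU grep on the node and with illustrative patterns:

    # Assumption: narrow the 400-line kubelet journal gather to
    # static-pod / apiserver activity; the patterns are illustrative only.
    sudo journalctl -u kubelet -n 400 | grep -iE 'static ?pod|apiserver|fail'

If that filter also comes back quiet, the failure predates container creation entirely and the containerd journal is the next place to look.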
	I1217 20:35:02.218783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:02.229042  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:02.229114  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:02.254286  420062 cri.go:89] found id: ""
	I1217 20:35:02.254300  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.254308  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:02.254315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:02.254374  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:02.281092  420062 cri.go:89] found id: ""
	I1217 20:35:02.281106  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.281114  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:02.281120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:02.281198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:02.310195  420062 cri.go:89] found id: ""
	I1217 20:35:02.310209  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.310217  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:02.310222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:02.310294  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:02.338807  420062 cri.go:89] found id: ""
	I1217 20:35:02.338821  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.338829  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:02.338834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:02.338904  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:02.364604  420062 cri.go:89] found id: ""
	I1217 20:35:02.364618  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.364625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:02.364631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:02.364693  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:02.389458  420062 cri.go:89] found id: ""
	I1217 20:35:02.389473  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.389481  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:02.389486  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:02.389544  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:02.419120  420062 cri.go:89] found id: ""
	I1217 20:35:02.419134  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.419142  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:02.419151  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:02.419162  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:02.476620  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:02.476640  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.492411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:02.492428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:02.567285  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:02.567294  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:02.567308  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:02.635002  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:02.635022  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.163567  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:05.174184  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:05.174245  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:05.199116  420062 cri.go:89] found id: ""
	I1217 20:35:05.199130  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.199137  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:05.199143  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:05.199206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:05.223477  420062 cri.go:89] found id: ""
	I1217 20:35:05.223491  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.223498  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:05.223504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:05.223562  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:05.247303  420062 cri.go:89] found id: ""
	I1217 20:35:05.247317  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.247325  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:05.247332  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:05.247391  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:05.272620  420062 cri.go:89] found id: ""
	I1217 20:35:05.272633  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.272641  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:05.272646  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:05.272703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:05.300419  420062 cri.go:89] found id: ""
	I1217 20:35:05.300434  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.300441  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:05.300446  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:05.300505  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:05.325851  420062 cri.go:89] found id: ""
	I1217 20:35:05.325866  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.325873  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:05.325879  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:05.325938  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:05.354430  420062 cri.go:89] found id: ""
	I1217 20:35:05.354445  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.354452  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:05.354460  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:05.354475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:05.369668  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:05.369686  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:05.436390  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:05.436400  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:05.436411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:05.499177  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:05.499202  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.531231  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:05.531248  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.088375  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:08.098640  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:08.098711  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:08.132112  420062 cri.go:89] found id: ""
	I1217 20:35:08.132127  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:08.132141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:08.132205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:08.157778  420062 cri.go:89] found id: ""
	I1217 20:35:08.157792  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.157800  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:08.157805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:08.157862  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:08.183372  420062 cri.go:89] found id: ""
	I1217 20:35:08.183386  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.183393  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:08.183399  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:08.183457  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:08.208186  420062 cri.go:89] found id: ""
	I1217 20:35:08.208200  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.208207  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:08.208212  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:08.208310  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:08.236181  420062 cri.go:89] found id: ""
	I1217 20:35:08.236195  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.236202  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:08.236207  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:08.236313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:08.261508  420062 cri.go:89] found id: ""
	I1217 20:35:08.261522  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.261529  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:08.261534  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:08.261593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:08.286303  420062 cri.go:89] found id: ""
	I1217 20:35:08.286318  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.286325  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:08.286333  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:08.286349  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.345547  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:08.345573  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:08.360551  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:08.360568  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:08.424581  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:08.424593  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:08.424606  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:08.489146  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:08.489166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
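Every "describe nodes" attempt below fails identically: kubectl cannot open a TCP connection to https://localhost:8441, so API group discovery is refused before any TLS or authorization step. That points at the apiserver never binding its port rather than at a credential problem. A quick way to confirm this from inside the node, assuming curl and ss are present in the node image:

	# connection refused here means nothing is listening on the apiserver port
	curl -k https://localhost:8441/healthz
	# list any listener bound to 8441
	sudo ss -ltnp | grep 8441 || echo 'no listener on 8441'

On a healthy cluster /healthz may still answer 401/403 without credentials; the signal here is whether the TCP connection is accepted at all.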
	I1217 20:35:11.022570  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:11.034138  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:11.034205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:11.066795  420062 cri.go:89] found id: ""
	I1217 20:35:11.066810  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.066817  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:11.066825  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:11.066888  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:11.092902  420062 cri.go:89] found id: ""
	I1217 20:35:11.092917  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.092925  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:11.092931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:11.092998  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:11.120040  420062 cri.go:89] found id: ""
	I1217 20:35:11.120056  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.120064  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:11.120069  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:11.120138  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:11.150096  420062 cri.go:89] found id: ""
	I1217 20:35:11.150111  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.150118  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:11.150124  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:11.150186  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:11.178952  420062 cri.go:89] found id: ""
	I1217 20:35:11.178966  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.178973  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:11.178979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:11.179042  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:11.205194  420062 cri.go:89] found id: ""
	I1217 20:35:11.205208  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.205215  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:11.205221  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:11.205281  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:11.231314  420062 cri.go:89] found id: ""
	I1217 20:35:11.231327  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.231335  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:11.231343  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:11.231355  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:11.246458  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:11.246475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:11.312684  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:11.312696  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:11.312706  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:11.379354  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:11.379374  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.413484  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:11.413500  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:13.972078  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:13.982223  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:13.982290  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:14.022488  420062 cri.go:89] found id: ""
	I1217 20:35:14.022502  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.022510  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:14.022515  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:14.022575  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:14.059328  420062 cri.go:89] found id: ""
	I1217 20:35:14.059342  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.059364  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:14.059369  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:14.059435  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:14.085531  420062 cri.go:89] found id: ""
	I1217 20:35:14.085544  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.085552  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:14.085558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:14.085616  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:14.114113  420062 cri.go:89] found id: ""
	I1217 20:35:14.114134  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.114141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:14.114147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:14.114210  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:14.138505  420062 cri.go:89] found id: ""
	I1217 20:35:14.138519  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.138526  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:14.138532  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:14.138591  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:14.162838  420062 cri.go:89] found id: ""
	I1217 20:35:14.162852  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.162858  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:14.162863  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:14.162923  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:14.190631  420062 cri.go:89] found id: ""
	I1217 20:35:14.190651  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.190665  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:14.190672  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:14.190682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:14.246544  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:14.246563  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:14.261703  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:14.261719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:14.327698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:14.327708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:14.327721  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:14.391616  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:14.391635  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:16.921553  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:16.931542  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:16.931604  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:16.955206  420062 cri.go:89] found id: ""
	I1217 20:35:16.955220  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.955227  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:16.955233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:16.955291  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:16.984598  420062 cri.go:89] found id: ""
	I1217 20:35:16.984613  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.984620  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:16.984625  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:16.984683  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:17.033712  420062 cri.go:89] found id: ""
	I1217 20:35:17.033726  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.033733  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:17.033739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:17.033796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:17.061936  420062 cri.go:89] found id: ""
	I1217 20:35:17.061950  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.061957  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:17.061963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:17.062023  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:17.086921  420062 cri.go:89] found id: ""
	I1217 20:35:17.086936  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.086943  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:17.086948  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:17.087009  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:17.112474  420062 cri.go:89] found id: ""
	I1217 20:35:17.112488  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.112495  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:17.112501  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:17.112558  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:17.137847  420062 cri.go:89] found id: ""
	I1217 20:35:17.137867  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.137875  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:17.137882  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:17.137892  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:17.198885  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:17.198904  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:17.213637  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:17.213652  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:17.281467  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:17.281478  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:17.281488  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:17.343313  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:17.343334  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:19.871984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:19.882066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:19.882128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:19.907664  420062 cri.go:89] found id: ""
	I1217 20:35:19.907678  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.907686  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:19.907691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:19.907750  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:19.936014  420062 cri.go:89] found id: ""
	I1217 20:35:19.936028  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.936035  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:19.936040  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:19.936099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:19.961865  420062 cri.go:89] found id: ""
	I1217 20:35:19.961881  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.961888  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:19.961893  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:19.961954  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:19.988749  420062 cri.go:89] found id: ""
	I1217 20:35:19.988762  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.988769  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:19.988775  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:19.988832  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:20.021844  420062 cri.go:89] found id: ""
	I1217 20:35:20.021859  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.021866  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:20.021873  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:20.021936  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:20.064328  420062 cri.go:89] found id: ""
	I1217 20:35:20.064343  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.064351  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:20.064356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:20.064464  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:20.092230  420062 cri.go:89] found id: ""
	I1217 20:35:20.092244  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.092272  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:20.092280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:20.092291  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:20.150597  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:20.150617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:20.166734  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:20.166751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:20.235344  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:20.235354  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:20.235368  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:20.300971  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:20.300991  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:22.830503  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:22.840565  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:22.840627  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:22.865965  420062 cri.go:89] found id: ""
	I1217 20:35:22.865980  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.865987  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:22.865992  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:22.866051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:22.890981  420062 cri.go:89] found id: ""
	I1217 20:35:22.890995  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.891002  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:22.891007  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:22.891067  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:22.916050  420062 cri.go:89] found id: ""
	I1217 20:35:22.916064  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.916070  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:22.916075  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:22.916134  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:22.940231  420062 cri.go:89] found id: ""
	I1217 20:35:22.940244  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.940274  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:22.940280  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:22.940338  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:22.964651  420062 cri.go:89] found id: ""
	I1217 20:35:22.964665  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.964673  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:22.964678  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:22.964739  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:22.999102  420062 cri.go:89] found id: ""
	I1217 20:35:22.999118  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.999126  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:22.999133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:22.999201  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:23.031417  420062 cri.go:89] found id: ""
	I1217 20:35:23.031431  420062 logs.go:282] 0 containers: []
	W1217 20:35:23.031440  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:23.031447  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:23.031458  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:23.099279  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:23.099300  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:23.127896  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:23.127914  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:23.184706  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:23.184725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:23.199879  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:23.199895  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:23.267184  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
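Each failed cycle collects the same four diagnostics: the kubelet and containerd journals, recent kernel warnings, and an all-states container listing. When triaging a run like this interactively, the bundle can be gathered in one pass with the commands the harness itself uses:

	sudo journalctl -u kubelet -n 400
	sudo journalctl -u containerd -n 400
	sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400
	sudo crictl ps -a

In this run the kubelet journal is the most likely place to show why the kube-apiserver static pod is never created.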
	I1217 20:35:25.768885  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:25.778947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:25.779017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:25.802991  420062 cri.go:89] found id: ""
	I1217 20:35:25.803005  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.803025  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:25.803031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:25.803093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:25.830724  420062 cri.go:89] found id: ""
	I1217 20:35:25.830738  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.830745  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:25.830751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:25.830813  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:25.860059  420062 cri.go:89] found id: ""
	I1217 20:35:25.860073  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.860081  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:25.860085  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:25.860150  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:25.896087  420062 cri.go:89] found id: ""
	I1217 20:35:25.896101  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.896108  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:25.896114  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:25.896173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:25.921891  420062 cri.go:89] found id: ""
	I1217 20:35:25.921905  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.921912  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:25.921918  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:25.921975  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:25.946115  420062 cri.go:89] found id: ""
	I1217 20:35:25.946129  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.946137  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:25.946142  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:25.946199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:25.970696  420062 cri.go:89] found id: ""
	I1217 20:35:25.970711  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.970719  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:25.970727  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:25.970737  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:26.031476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:26.031497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:26.053026  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:26.053044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:26.121268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:26.121279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:26.121290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:26.183866  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:26.183888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:28.713125  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:28.723373  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:28.723436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:28.750204  420062 cri.go:89] found id: ""
	I1217 20:35:28.750218  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.750225  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:28.750231  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:28.750295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:28.774507  420062 cri.go:89] found id: ""
	I1217 20:35:28.774520  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.774528  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:28.774533  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:28.774593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:28.799202  420062 cri.go:89] found id: ""
	I1217 20:35:28.799217  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.799225  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:28.799230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:28.799295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:28.823894  420062 cri.go:89] found id: ""
	I1217 20:35:28.823908  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.823916  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:28.823921  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:28.823981  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:28.848696  420062 cri.go:89] found id: ""
	I1217 20:35:28.848710  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.848717  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:28.848722  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:28.848780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:28.874108  420062 cri.go:89] found id: ""
	I1217 20:35:28.874121  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.874129  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:28.874146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:28.874206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:28.899607  420062 cri.go:89] found id: ""
	I1217 20:35:28.899621  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.899628  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:28.899636  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:28.899646  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:28.955990  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:28.956010  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:28.970828  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:28.970844  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:29.048596  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:29.048606  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:29.048627  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:29.115475  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:29.115495  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:31.644907  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:31.654819  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:31.654879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:31.678281  420062 cri.go:89] found id: ""
	I1217 20:35:31.678295  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.678303  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:31.678308  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:31.678370  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:31.702902  420062 cri.go:89] found id: ""
	I1217 20:35:31.702916  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.702923  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:31.702929  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:31.702988  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:31.730614  420062 cri.go:89] found id: ""
	I1217 20:35:31.730629  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.730643  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:31.730648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:31.730715  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:31.757724  420062 cri.go:89] found id: ""
	I1217 20:35:31.757738  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.757745  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:31.757751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:31.757821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:31.781313  420062 cri.go:89] found id: ""
	I1217 20:35:31.781326  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.781333  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:31.781338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:31.781401  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:31.805048  420062 cri.go:89] found id: ""
	I1217 20:35:31.805061  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.805068  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:31.805074  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:31.805133  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:31.829157  420062 cri.go:89] found id: ""
	I1217 20:35:31.829172  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.829178  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:31.829186  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:31.829211  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:31.884232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:31.884262  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:31.899125  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:31.899143  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:31.960768  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:31.960779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:31.960789  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:32.026560  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:32.026580  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:34.561956  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:34.573345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:34.573414  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:34.601971  420062 cri.go:89] found id: ""
	I1217 20:35:34.601985  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.601993  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:34.601998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:34.602057  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:34.631487  420062 cri.go:89] found id: ""
	I1217 20:35:34.631500  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.631508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:34.631513  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:34.631572  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:34.656452  420062 cri.go:89] found id: ""
	I1217 20:35:34.656465  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.656473  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:34.656478  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:34.656540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:34.682582  420062 cri.go:89] found id: ""
	I1217 20:35:34.682596  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.682603  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:34.682609  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:34.682676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:34.713925  420062 cri.go:89] found id: ""
	I1217 20:35:34.713939  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.713947  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:34.713952  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:34.714017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:34.742385  420062 cri.go:89] found id: ""
	I1217 20:35:34.742400  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.742408  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:34.742414  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:34.742473  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:34.767035  420062 cri.go:89] found id: ""
	I1217 20:35:34.767049  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.767056  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:34.767064  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:34.767075  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:34.822796  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:34.822817  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:34.837590  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:34.837613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:34.900508  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:34.900518  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:34.900529  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:34.962881  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:34.962905  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:37.494984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:37.505451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:37.505514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:37.530852  420062 cri.go:89] found id: ""
	I1217 20:35:37.530866  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.530874  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:37.530885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:37.530948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:37.555283  420062 cri.go:89] found id: ""
	I1217 20:35:37.555298  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.555305  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:37.555319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:37.555384  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:37.580310  420062 cri.go:89] found id: ""
	I1217 20:35:37.580324  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.580342  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:37.580347  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:37.580407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:37.604561  420062 cri.go:89] found id: ""
	I1217 20:35:37.604575  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.604582  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:37.604587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:37.604649  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:37.633577  420062 cri.go:89] found id: ""
	I1217 20:35:37.633591  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.633598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:37.633603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:37.633668  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:37.659137  420062 cri.go:89] found id: ""
	I1217 20:35:37.659152  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.659159  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:37.659183  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:37.659280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:37.687689  420062 cri.go:89] found id: ""
	I1217 20:35:37.687704  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.687711  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:37.687719  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:37.687738  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:37.742459  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:37.742478  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:37.757175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:37.757191  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:37.822005  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:37.822015  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:37.822025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:37.885848  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:37.885870  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
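	(Each diagnostic pass walks the same control-plane component list through crictl: `crictl ps -a --quiet --name=<name>` prints one container ID per line, and empty output is what cri.go reports as `found id: ""` / "0 containers". A hedged sketch of that scan, run directly rather than over minikube's ssh_runner, using only the crictl flags that appear in the log and assuming passwordless sudo inside the node:

	// cri_scan.go - a minimal sketch of the component scan the log repeats.
	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	func main() {
		components := []string{
			"kube-apiserver", "etcd", "coredns", "kube-scheduler",
			"kube-proxy", "kube-controller-manager", "kindnet",
		}
		for _, name := range components {
			// Same command the log shows: sudo crictl ps -a --quiet --name=<name>
			out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
			ids := strings.Fields(string(out))
			if err != nil || len(ids) == 0 {
				// Corresponds to the W-level "No container was found matching" lines.
				fmt.Printf("no container found matching %q\n", name)
				continue
			}
			fmt.Printf("%s: %d container(s): %v\n", name, len(ids), ids)
		}
	}

	In this run every component comes back empty, so the gathered kubelet, dmesg, and containerd logs are the only evidence left of why the control plane never came up.)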
	I1217 20:35:40.416602  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:40.427031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:40.427099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:40.452190  420062 cri.go:89] found id: ""
	I1217 20:35:40.452204  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.452212  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:40.452218  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:40.452299  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:40.478942  420062 cri.go:89] found id: ""
	I1217 20:35:40.478956  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.478963  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:40.478969  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:40.479027  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:40.504873  420062 cri.go:89] found id: ""
	I1217 20:35:40.504886  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.504893  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:40.504898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:40.504958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:40.530153  420062 cri.go:89] found id: ""
	I1217 20:35:40.530167  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.530173  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:40.530179  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:40.530239  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:40.558703  420062 cri.go:89] found id: ""
	I1217 20:35:40.558717  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.558725  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:40.558731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:40.558799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:40.583753  420062 cri.go:89] found id: ""
	I1217 20:35:40.583768  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.583777  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:40.583793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:40.583856  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:40.608061  420062 cri.go:89] found id: ""
	I1217 20:35:40.608075  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.608083  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:40.608099  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:40.608111  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:40.665201  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:40.665222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:40.680290  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:40.680307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:40.752424  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:40.752435  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:40.752446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:40.819510  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:40.819535  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.356404  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:43.367228  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:43.367293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:43.391809  420062 cri.go:89] found id: ""
	I1217 20:35:43.391824  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.391831  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:43.391836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:43.391895  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:43.417869  420062 cri.go:89] found id: ""
	I1217 20:35:43.417883  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.417890  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:43.417895  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:43.417959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:43.443314  420062 cri.go:89] found id: ""
	I1217 20:35:43.443328  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.443335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:43.443340  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:43.443400  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:43.469332  420062 cri.go:89] found id: ""
	I1217 20:35:43.469346  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.469352  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:43.469358  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:43.469418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:43.494242  420062 cri.go:89] found id: ""
	I1217 20:35:43.494256  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.494264  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:43.494277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:43.494341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:43.520502  420062 cri.go:89] found id: ""
	I1217 20:35:43.520515  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.520523  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:43.520529  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:43.520592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:43.549390  420062 cri.go:89] found id: ""
	I1217 20:35:43.549404  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.549411  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:43.549419  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:43.549435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:43.565708  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:43.565725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:43.633544  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:43.633555  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:43.633567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:43.696433  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:43.696457  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.727227  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:43.727244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.288373  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:46.298318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:46.298381  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:46.322903  420062 cri.go:89] found id: ""
	I1217 20:35:46.322918  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.322925  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:46.322931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:46.322992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:46.347241  420062 cri.go:89] found id: ""
	I1217 20:35:46.347253  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.347260  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:46.347265  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:46.347324  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:46.372209  420062 cri.go:89] found id: ""
	I1217 20:35:46.372222  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.372229  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:46.372235  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:46.372313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:46.399343  420062 cri.go:89] found id: ""
	I1217 20:35:46.399357  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.399365  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:46.399370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:46.399430  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:46.425023  420062 cri.go:89] found id: ""
	I1217 20:35:46.425036  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.425051  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:46.425057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:46.425119  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:46.450066  420062 cri.go:89] found id: ""
	I1217 20:35:46.450080  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.450087  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:46.450092  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:46.450153  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:46.474598  420062 cri.go:89] found id: ""
	I1217 20:35:46.474612  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.474619  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:46.474644  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:46.474654  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:46.536781  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:46.536801  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:46.570140  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:46.570155  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.628870  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:46.628888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:46.643875  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:46.643891  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:46.709883  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:49.210139  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:49.220394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:49.220461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:49.256343  420062 cri.go:89] found id: ""
	I1217 20:35:49.256358  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.256365  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:49.256370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:49.256431  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:49.290171  420062 cri.go:89] found id: ""
	I1217 20:35:49.290185  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.290193  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:49.290198  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:49.290261  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:49.320916  420062 cri.go:89] found id: ""
	I1217 20:35:49.320931  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.320939  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:49.320944  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:49.321003  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:49.345394  420062 cri.go:89] found id: ""
	I1217 20:35:49.345408  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.345415  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:49.345421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:49.345478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:49.370339  420062 cri.go:89] found id: ""
	I1217 20:35:49.370353  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.370360  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:49.370365  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:49.370424  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:49.394642  420062 cri.go:89] found id: ""
	I1217 20:35:49.394656  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.394663  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:49.394668  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:49.394734  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:49.422548  420062 cri.go:89] found id: ""
	I1217 20:35:49.422562  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.422569  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:49.422577  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:49.422594  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:49.479225  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:49.479246  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:49.494238  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:49.494255  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:49.560086  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:49.560096  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:49.560106  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:49.622094  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:49.622114  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
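	(The timestamps show the whole pass being retried roughly every 2.5 seconds: 20:35:46, 20:35:49, 20:35:52, and so on. A minimal sketch of that retry shape, where pollUntil is a hypothetical helper and not a minikube API, with the condition mirroring the pgrep check the log repeats at the top of each pass:

	// poll.go - a hedged sketch of the retry cadence implied by the timestamps.
	package main

	import (
		"fmt"
		"os/exec"
		"time"
	)

	// pollUntil calls cond every interval until it returns true or timeout elapses.
	func pollUntil(interval, timeout time.Duration, cond func() bool) bool {
		deadline := time.Now().Add(timeout)
		for {
			if cond() {
				return true
			}
			if time.Now().After(deadline) {
				return false
			}
			time.Sleep(interval)
		}
	}

	func main() {
		ok := pollUntil(2500*time.Millisecond, 2*time.Minute, func() bool {
			// Same check each pass starts with: sudo pgrep -xnf kube-apiserver.*minikube.*
			// pgrep exits 0 only if a matching process exists.
			return exec.Command("pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil
		})
		fmt.Println("apiserver running:", ok)
	}

	Here the condition never becomes true, so the passes continue until the test's overall start timeout is exhausted.)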
	I1217 20:35:52.150210  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:52.160168  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:52.160231  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:52.184746  420062 cri.go:89] found id: ""
	I1217 20:35:52.184760  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.184767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:52.184779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:52.184835  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:52.209501  420062 cri.go:89] found id: ""
	I1217 20:35:52.209515  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.209522  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:52.209528  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:52.209586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:52.234558  420062 cri.go:89] found id: ""
	I1217 20:35:52.234571  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.234579  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:52.234584  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:52.234654  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:52.265703  420062 cri.go:89] found id: ""
	I1217 20:35:52.265716  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.265724  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:52.265729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:52.265794  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:52.297248  420062 cri.go:89] found id: ""
	I1217 20:35:52.297263  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.297270  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:52.297275  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:52.297334  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:52.325342  420062 cri.go:89] found id: ""
	I1217 20:35:52.325355  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.325362  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:52.325367  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:52.325433  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:52.349812  420062 cri.go:89] found id: ""
	I1217 20:35:52.349826  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.349843  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:52.349851  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:52.349862  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.380735  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:52.380751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:52.436131  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:52.436151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:52.451427  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:52.451445  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:52.518482  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:52.518492  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:52.518503  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:55.081073  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:55.091720  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:55.091797  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:55.117311  420062 cri.go:89] found id: ""
	I1217 20:35:55.117325  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.117333  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:55.117338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:55.117398  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:55.141668  420062 cri.go:89] found id: ""
	I1217 20:35:55.141683  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.141692  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:55.141697  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:55.141760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:55.166517  420062 cri.go:89] found id: ""
	I1217 20:35:55.166534  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.166541  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:55.166546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:55.166611  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:55.191282  420062 cri.go:89] found id: ""
	I1217 20:35:55.191296  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.191304  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:55.191309  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:55.191369  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:55.215605  420062 cri.go:89] found id: ""
	I1217 20:35:55.215619  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.215626  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:55.215631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:55.215690  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:55.247101  420062 cri.go:89] found id: ""
	I1217 20:35:55.247124  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.247132  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:55.247137  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:55.247205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:55.288704  420062 cri.go:89] found id: ""
	I1217 20:35:55.288718  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.288725  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:55.288732  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:55.288743  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:55.320382  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:55.320398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:55.379997  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:55.380016  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:55.394762  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:55.394780  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:55.459997  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:55.460007  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:55.460018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:58.024408  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:58.035410  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:58.035478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:58.062124  420062 cri.go:89] found id: ""
	I1217 20:35:58.062138  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.062145  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:58.062151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:58.062211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:58.088229  420062 cri.go:89] found id: ""
	I1217 20:35:58.088243  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.088270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:58.088276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:58.088335  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:58.113240  420062 cri.go:89] found id: ""
	I1217 20:35:58.113255  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.113261  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:58.113266  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:58.113325  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:58.141811  420062 cri.go:89] found id: ""
	I1217 20:35:58.141825  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.141832  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:58.141837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:58.141897  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:58.170463  420062 cri.go:89] found id: ""
	I1217 20:35:58.170477  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.170484  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:58.170490  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:58.170548  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:58.194647  420062 cri.go:89] found id: ""
	I1217 20:35:58.194670  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.194678  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:58.194684  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:58.194760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:58.219714  420062 cri.go:89] found id: ""
	I1217 20:35:58.219728  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.219735  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:58.219743  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:58.219754  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:58.263178  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:58.263194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:58.325412  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:58.325433  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:58.341419  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:58.341435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:58.403135  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:58.403147  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:58.403163  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:00.965498  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:00.975759  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:00.975820  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:01.000786  420062 cri.go:89] found id: ""
	I1217 20:36:01.000803  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.000811  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:01.000818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:01.000892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:01.025695  420062 cri.go:89] found id: ""
	I1217 20:36:01.025709  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.025716  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:01.025721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:01.025784  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:01.054712  420062 cri.go:89] found id: ""
	I1217 20:36:01.054727  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.054734  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:01.054739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:01.054799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:01.083318  420062 cri.go:89] found id: ""
	I1217 20:36:01.083332  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.083340  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:01.083345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:01.083406  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:01.107939  420062 cri.go:89] found id: ""
	I1217 20:36:01.107954  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.107962  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:01.107968  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:01.108030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:01.134926  420062 cri.go:89] found id: ""
	I1217 20:36:01.134940  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.134947  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:01.134954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:01.135018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:01.161095  420062 cri.go:89] found id: ""
	I1217 20:36:01.161111  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.161121  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:01.161130  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:01.161141  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:01.222094  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:01.222112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:01.239432  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:01.239449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:01.331243  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:01.331254  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:01.331265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:01.398128  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:01.398148  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
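
The container-status step on the Run lines above uses a small fallback chain: prefer crictl by resolved path, fall back to the bare name, and finally to "docker ps -a" if crictl fails outright. Spelled out as a sketch:

    # Fallback chain from the container-status Run lines above.
    CRICTL=$(which crictl || echo crictl)   # resolved path if found, bare name otherwise
    sudo "$CRICTL" ps -a || sudo docker ps -a
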
	I1217 20:36:03.929660  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:03.940045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:03.940111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:03.963644  420062 cri.go:89] found id: ""
	I1217 20:36:03.963658  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.963665  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:03.963670  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:03.963727  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:03.996893  420062 cri.go:89] found id: ""
	I1217 20:36:03.996907  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.996914  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:03.996919  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:03.996987  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:04.028499  420062 cri.go:89] found id: ""
	I1217 20:36:04.028514  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.028530  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:04.028535  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:04.028607  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:04.054700  420062 cri.go:89] found id: ""
	I1217 20:36:04.054715  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.054723  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:04.054728  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:04.054785  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:04.082040  420062 cri.go:89] found id: ""
	I1217 20:36:04.082054  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.082063  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:04.082068  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:04.082131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:04.107015  420062 cri.go:89] found id: ""
	I1217 20:36:04.107029  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.107037  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:04.107043  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:04.107109  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:04.134634  420062 cri.go:89] found id: ""
	I1217 20:36:04.134648  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.134655  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:04.134663  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:04.134673  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:04.191059  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:04.191079  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:04.206280  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:04.206298  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:04.297698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:04.297708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:04.297719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:04.364378  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:04.364398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:06.892149  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:06.902353  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:06.902418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:06.927834  420062 cri.go:89] found id: ""
	I1217 20:36:06.927847  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.927855  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:06.927860  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:06.927925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:06.952936  420062 cri.go:89] found id: ""
	I1217 20:36:06.952949  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.952956  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:06.952965  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:06.953024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:06.976184  420062 cri.go:89] found id: ""
	I1217 20:36:06.976198  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.976205  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:06.976210  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:06.976297  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:07.004079  420062 cri.go:89] found id: ""
	I1217 20:36:07.004093  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.004101  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:07.004106  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:07.004167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:07.029604  420062 cri.go:89] found id: ""
	I1217 20:36:07.029618  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.029625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:07.029630  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:07.029698  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:07.058618  420062 cri.go:89] found id: ""
	I1217 20:36:07.058637  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.058645  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:07.058650  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:07.058709  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:07.085932  420062 cri.go:89] found id: ""
	I1217 20:36:07.085946  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.085953  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:07.085961  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:07.085972  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:07.100543  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:07.100561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:07.162557  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:07.162567  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:07.162578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:07.226244  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:07.226265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:07.280558  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:07.280574  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:09.844282  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:09.854593  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:09.854676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:09.883180  420062 cri.go:89] found id: ""
	I1217 20:36:09.883194  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.883202  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:09.883208  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:09.883268  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:09.907225  420062 cri.go:89] found id: ""
	I1217 20:36:09.907240  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.907248  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:09.907254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:09.907315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:09.936079  420062 cri.go:89] found id: ""
	I1217 20:36:09.936093  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.936100  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:09.936105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:09.936167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:09.961921  420062 cri.go:89] found id: ""
	I1217 20:36:09.961935  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.961943  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:09.961949  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:09.962028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:09.989285  420062 cri.go:89] found id: ""
	I1217 20:36:09.989299  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.989307  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:09.989312  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:09.989371  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:10.023888  420062 cri.go:89] found id: ""
	I1217 20:36:10.023905  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.023913  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:10.023920  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:10.023992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:10.056062  420062 cri.go:89] found id: ""
	I1217 20:36:10.056077  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.056084  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:10.056102  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:10.056112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:10.118144  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:10.118165  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:10.153504  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:10.153521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:10.209909  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:10.209931  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:10.224930  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:10.224946  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:10.310457  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
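
This pgrep-plus-crictl cycle repeats roughly every three seconds, and every pass comes back empty: no kube-apiserver process, no control-plane containers, port 8441 still refusing connections. A sketch of the outer wait loop; the 240 s budget here is an assumption inferred from the roughly 4-minute duration reported below, not a constant taken from minikube's source:

    # Sketch of the apiserver wait loop; the deadline is an assumption
    # based on the ~4m restartPrimaryControlPlane duration reported below.
    deadline=$((SECONDS + 240))
    until sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; do
      if [ "$SECONDS" -ge "$deadline" ]; then
        echo "apiserver never came up; giving up" >&2
        break
      fi
      sleep 3
    done
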
	I1217 20:36:12.811296  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:12.821279  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:12.821339  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:12.845496  420062 cri.go:89] found id: ""
	I1217 20:36:12.845510  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.845519  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:12.845524  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:12.845582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:12.873951  420062 cri.go:89] found id: ""
	I1217 20:36:12.873966  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.873973  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:12.873978  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:12.874039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:12.898560  420062 cri.go:89] found id: ""
	I1217 20:36:12.898573  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.898580  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:12.898586  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:12.898661  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:12.931323  420062 cri.go:89] found id: ""
	I1217 20:36:12.931343  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.931350  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:12.931356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:12.931416  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:12.957667  420062 cri.go:89] found id: ""
	I1217 20:36:12.957680  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.957687  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:12.957692  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:12.957749  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:12.981848  420062 cri.go:89] found id: ""
	I1217 20:36:12.981863  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.981870  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:12.981876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:12.981934  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:13.007649  420062 cri.go:89] found id: ""
	I1217 20:36:13.007664  420062 logs.go:282] 0 containers: []
	W1217 20:36:13.007671  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:13.007679  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:13.007689  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:13.070827  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:13.070846  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:13.098938  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:13.098954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:13.155232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:13.155253  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:13.170218  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:13.170234  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:13.237601  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:15.739451  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:15.749635  420062 kubeadm.go:602] duration metric: took 4m4.768391835s to restartPrimaryControlPlane
	W1217 20:36:15.749706  420062 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 20:36:15.749781  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
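
With the restart window exhausted, minikube resets the cluster before attempting a fresh init. The reset invocation from the Run line above, rewritten in an equivalent "sudo env" form purely for readability:

    # Same reset as the Run line above, reflowed; contents unchanged.
    sudo env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" \
      kubeadm reset --cri-socket /run/containerd/containerd.sock --force
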
	I1217 20:36:16.165425  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:36:16.179463  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:36:16.187987  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:36:16.188041  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:36:16.195805  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:36:16.195815  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:36:16.195868  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:36:16.203578  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:36:16.203633  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:36:16.211222  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:36:16.218882  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:36:16.218939  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:36:16.226500  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.233980  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:36:16.234040  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.241486  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:36:16.250121  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:36:16.250177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
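
The four grep-then-rm pairs above are the stale-kubeconfig cleanup: each /etc/kubernetes/*.conf is kept only if it already points at https://control-plane.minikube.internal:8441. Here every file is missing (each grep exits with status 2), so each rm is a no-op. The same check as a compact sketch:

    # Sketch of the stale-config cleanup the log performs file by file.
    for f in admin.conf kubelet.conf controller-manager.conf scheduler.conf; do
      sudo grep -q 'https://control-plane.minikube.internal:8441' "/etc/kubernetes/$f" \
        || sudo rm -f "/etc/kubernetes/$f"
    done
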
	I1217 20:36:16.257963  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
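
The init command above is a single long line; an equivalent reflow, with the unchanged --ignore-preflight-errors list held in a shell variable purely for readability:

    # Same init invocation as the Start line above, reflowed; nothing added.
    IGNORES=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables
    sudo env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" \
      kubeadm init --config /var/tmp/minikube/kubeadm.yaml \
      --ignore-preflight-errors="$IGNORES"
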
	I1217 20:36:16.296719  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:36:16.297028  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:36:16.367021  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:36:16.367085  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:36:16.367119  420062 kubeadm.go:319] OS: Linux
	I1217 20:36:16.367163  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:36:16.367211  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:36:16.367257  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:36:16.367304  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:36:16.367351  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:36:16.367397  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:36:16.367441  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:36:16.367493  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:36:16.367539  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:36:16.443855  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:36:16.443958  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:36:16.444047  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:36:16.456800  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:36:16.459720  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:36:16.459808  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:36:16.459875  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:36:16.459957  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:36:16.460026  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:36:16.460100  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:36:16.460156  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:36:16.460222  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:36:16.460299  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:36:16.460377  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:36:16.460454  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:36:16.460493  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:36:16.460552  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:36:16.591707  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:36:16.773515  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:36:16.895942  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:36:17.316963  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:36:17.418134  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:36:17.418872  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:36:17.421748  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
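
Per the [etcd] and [control-plane] lines above, kubeadm writes one static Pod manifest per control-plane component into /etc/kubernetes/manifests; the kubelet is then expected to start them as static Pods. The expected layout (standard kubeadm naming, also echoed in the FileAvailable-- entries of the ignore-preflight list above):

    # Static Pod manifests kubeadm writes for the control plane:
    ls /etc/kubernetes/manifests
    # etcd.yaml  kube-apiserver.yaml  kube-controller-manager.yaml  kube-scheduler.yaml
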
	I1217 20:36:17.424898  420062 out.go:252]   - Booting up control plane ...
	I1217 20:36:17.424999  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:36:17.425075  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:36:17.425522  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:36:17.446706  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:36:17.446809  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:36:17.455830  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:36:17.455925  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:36:17.455963  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:36:17.596746  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:36:17.596869  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:40:17.595000  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000220112s
	I1217 20:40:17.595032  420062 kubeadm.go:319] 
	I1217 20:40:17.595086  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:40:17.595116  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:40:17.595215  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:40:17.595220  420062 kubeadm.go:319] 
	I1217 20:40:17.595317  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:40:17.595346  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:40:17.595375  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:40:17.595378  420062 kubeadm.go:319] 
	I1217 20:40:17.599582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:40:17.600077  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:40:17.600181  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:40:17.600461  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:40:17.600468  420062 kubeadm.go:319] 
	I1217 20:40:17.600540  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1217 20:40:17.600694  420062 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1217 20:40:17.600780  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:40:18.014309  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:40:18.029681  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:40:18.029742  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:40:18.038728  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:40:18.038739  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:40:18.038796  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:40:18.047726  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:40:18.047785  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:40:18.056139  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:40:18.064964  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:40:18.065020  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:40:18.073071  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.081347  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:40:18.081407  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.089386  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:40:18.097546  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:40:18.097608  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:40:18.105445  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:40:18.146508  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:40:18.146883  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:40:18.223079  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:40:18.223139  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:40:18.223171  420062 kubeadm.go:319] OS: Linux
	I1217 20:40:18.223212  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:40:18.223257  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:40:18.223306  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:40:18.223354  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:40:18.223398  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:40:18.223442  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:40:18.223484  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:40:18.223529  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:40:18.223571  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:40:18.290116  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:40:18.290214  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:40:18.290297  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:40:18.296827  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:40:18.300313  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:40:18.300404  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:40:18.300483  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:40:18.300564  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:40:18.300623  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:40:18.300692  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:40:18.300745  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:40:18.300806  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:40:18.300867  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:40:18.300940  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:40:18.301011  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:40:18.301047  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:40:18.301101  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:40:18.651136  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:40:18.865861  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:40:19.156184  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:40:19.613234  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:40:19.777874  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:40:19.778689  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:40:19.781521  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:40:19.784636  420062 out.go:252]   - Booting up control plane ...
	I1217 20:40:19.784726  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:40:19.784798  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:40:19.786110  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:40:19.806173  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:40:19.806463  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:40:19.814039  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:40:19.814294  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:40:19.814465  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:40:19.960654  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:40:19.960777  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:44:19.954818  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001239508s
	I1217 20:44:19.954843  420062 kubeadm.go:319] 
	I1217 20:44:19.954896  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:44:19.954927  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:44:19.955102  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:44:19.955108  420062 kubeadm.go:319] 
	I1217 20:44:19.955205  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:44:19.955233  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:44:19.955262  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:44:19.955265  420062 kubeadm.go:319] 
	I1217 20:44:19.960153  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:44:19.960582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:44:19.960689  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:44:19.960924  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:44:19.960929  420062 kubeadm.go:319] 
	I1217 20:44:19.960996  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 20:44:19.961048  420062 kubeadm.go:403] duration metric: took 12m9.01968184s to StartCluster
	I1217 20:44:19.961079  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:44:19.961139  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:44:19.999166  420062 cri.go:89] found id: ""
	I1217 20:44:19.999182  420062 logs.go:282] 0 containers: []
	W1217 20:44:19.999190  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:44:19.999195  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:44:19.999265  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:44:20.031203  420062 cri.go:89] found id: ""
	I1217 20:44:20.031218  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.031225  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:44:20.031230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:44:20.031293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:44:20.061179  420062 cri.go:89] found id: ""
	I1217 20:44:20.061193  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.061200  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:44:20.061219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:44:20.061280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:44:20.089093  420062 cri.go:89] found id: ""
	I1217 20:44:20.089107  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.089114  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:44:20.089120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:44:20.089183  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:44:20.119683  420062 cri.go:89] found id: ""
	I1217 20:44:20.119696  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.119704  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:44:20.119709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:44:20.119772  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:44:20.145500  420062 cri.go:89] found id: ""
	I1217 20:44:20.145514  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.145521  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:44:20.145526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:44:20.145586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:44:20.170345  420062 cri.go:89] found id: ""
	I1217 20:44:20.170359  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.170367  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:44:20.170377  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:44:20.170387  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:44:20.226476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:44:20.226496  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:44:20.241970  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:44:20.241987  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:44:20.311525  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:44:20.311535  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:44:20.311546  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:44:20.375759  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:44:20.375781  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 20:44:20.404823  420062 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:44:20.404857  420062 out.go:285] * 
	W1217 20:44:20.404931  420062 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.404948  420062 out.go:285] * 
	W1217 20:44:20.407052  420062 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:44:20.412138  420062 out.go:203] 
	W1217 20:44:20.415946  420062 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.415994  420062 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:44:20.416018  420062 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:44:20.419093  420062 out.go:203] 
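	Editor's note on the failure above: the root cause surfaces in the kubelet journal further down, where kubelet v1.35.0-rc.1 refuses to validate its configuration on a host still running the legacy cgroup v1 hierarchy, so every restart dies before the :10248 healthz endpoint can come up. As a hedged reproduction aid (not part of the test run), the sketch below shows how to confirm the host's cgroup version and what the opt-in named in the kubeadm warning ('FailCgroupV1' set to 'false') looks like as a KubeletConfiguration snippet; the file name is hypothetical and nothing here is applied by minikube automatically. The suggestion printed above (--extra-config=kubelet.cgroup-driver=systemd) targets an older cgroup-driver mismatch and likely would not clear this validation error.

	# Sketch: identify the host cgroup hierarchy. "cgroup2fs" means
	# cgroup v2; "tmpfs" indicates the legacy cgroup v1 mount that
	# kubelet v1.35 rejects by default.
	stat -fc %T /sys/fs/cgroup

	# Per the kubeadm warning, cgroup v1 hosts must opt in explicitly.
	# Hypothetical override file, shown for illustration only:
	cat <<'EOF' > kubelet-cgroupv1.yaml
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	failCgroupV1: false
	EOF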
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304459447Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304532998Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304632437Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304709099Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304775544Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304836714Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304892469Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304951784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305023309Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305106805Z" level=info msg="Connect containerd service"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305473562Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.306163145Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318314045Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318400322Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318427285Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318481078Z" level=info msg="Start recovering state"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358031279Z" level=info msg="Start event monitor"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358217808Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358291688Z" level=info msg="Start streaming server"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358359291Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358415021Z" level=info msg="runtime interface starting up..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358467600Z" level=info msg="starting plugins..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358529204Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:32:09 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.361000854Z" level=info msg="containerd successfully booted in 0.082346s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:21.609734   21180 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:21.610337   21180 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:21.612075   21180 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:21.612627   21180 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:21.614277   21180 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:44:21 up  3:26,  0 user,  load average: 0.56, 0.25, 0.47
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:44:18 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:19 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 17 20:44:19 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:19 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:19 functional-682596 kubelet[20986]: E1217 20:44:19.287566   20986 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:19 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:19 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:19 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 17 20:44:19 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:19 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:20 functional-682596 kubelet[20996]: E1217 20:44:20.060446   20996 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 17 20:44:20 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:20 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:20 functional-682596 kubelet[21091]: E1217 20:44:20.740286   21091 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 17 20:44:21 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:21 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:21 functional-682596 kubelet[21166]: E1217 20:44:21.541596   21166 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
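Editor's note: the kubelet journal at the end of the dump above is the decisive evidence. The restart counter climbs from 319 to 322 within a few seconds, each attempt failing configuration validation with "kubelet is configured to not run on a host using cgroup v1", which is exactly why the four-minute healthz wait times out. The kubeadm output's own troubleshooting commands can be replayed against the node container; the sketch below wraps them in minikube ssh, assuming the profile name from this run.

	# Replay the kubeadm-suggested diagnostics inside the minikube node:
	minikube ssh -p functional-682596 "sudo systemctl status kubelet"
	minikube ssh -p functional-682596 "sudo journalctl -xeu kubelet | tail -n 50"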
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (356.454012ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (736.12s)
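Editor's note: the post-mortem probe above is plain minikube output, not test-framework magic; status --format takes a Go template over minikube's status struct, and {{.APIServer}} selects the apiserver field. A hand-run equivalent, assuming the same profile:

	# Reproduce the harness's apiserver probe; on this run it prints
	# "Stopped" and exits with status 2, matching the output above.
	minikube status --format='{{.APIServer}}' -p functional-682596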

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.16s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-682596 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-682596 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (63.997296ms)

** stderr ** 
	E1217 20:44:22.521123  432175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:22.522707  432175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:22.524124  432175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:22.526111  432175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:22.527520  432175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-682596 get po -l tier=control-plane -n kube-system -o=json": exit status 1
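The kubectl errors above are a downstream symptom rather than a separate failure: with kubelet down (see the cgroup v1 error in the previous test), the apiserver container is never started, so every request to 192.168.49.2:8441 is refused. A quick probe from the host (a sketch; curl on the host is an assumption, and /livez is the standard apiserver health endpoint):

	curl -ksS --max-time 5 https://192.168.49.2:8441/livez || echo "apiserver unreachable"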
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
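The inspect output confirms the container itself is healthy ("Status": "running", no OOM kill, "RestartCount": 0) and that the apiserver port 8441/tcp is published on 127.0.0.1:33166; only the Kubernetes processes inside it are down. The same Go template these logs use for 22/tcp can pull any mapped port, for example (a sketch):

	docker inspect functional-682596 --format '{{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}'
	# expected output here: 33166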
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (301.11631ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-032730 image ls --format short --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format json --alsologtostderr                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls --format table --alsologtostderr                                                                                           │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ ssh     │ functional-032730 ssh pgrep buildkitd                                                                                                                 │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ image   │ functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr                                                │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ image   │ functional-032730 image ls                                                                                                                            │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ delete  │ -p functional-032730                                                                                                                                  │ functional-032730 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │ 17 Dec 25 20:17 UTC │
	│ start   │ -p functional-682596 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:17 UTC │                     │
	│ start   │ -p functional-682596 --alsologtostderr -v=8                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:25 UTC │                     │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add registry.k8s.io/pause:latest                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache add minikube-local-cache-test:functional-682596                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ functional-682596 cache delete minikube-local-cache-test:functional-682596                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ cache   │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl images                                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │                     │
	│ cache   │ functional-682596 cache reload                                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ kubectl │ functional-682596 kubectl -- --context functional-682596 get pods                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	│ start   │ -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:32:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:32:06.395598  420062 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:32:06.395704  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395708  420062 out.go:374] Setting ErrFile to fd 2...
	I1217 20:32:06.395712  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395972  420062 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:32:06.396388  420062 out.go:368] Setting JSON to false
	I1217 20:32:06.397206  420062 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11672,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:32:06.397266  420062 start.go:143] virtualization:  
	I1217 20:32:06.400889  420062 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:32:06.403953  420062 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:32:06.404019  420062 notify.go:221] Checking for updates...
	I1217 20:32:06.410244  420062 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:32:06.413231  420062 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:32:06.416152  420062 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:32:06.419145  420062 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:32:06.422186  420062 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:32:06.425355  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:06.425444  420062 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:32:06.459431  420062 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:32:06.459555  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.531840  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.520070933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.531937  420062 docker.go:319] overlay module found
	I1217 20:32:06.535075  420062 out.go:179] * Using the docker driver based on existing profile
	I1217 20:32:06.538013  420062 start.go:309] selected driver: docker
	I1217 20:32:06.538025  420062 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.538123  420062 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:32:06.538239  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.599898  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.590438982 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.600362  420062 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:32:06.600387  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:06.600439  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:06.600480  420062 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.605529  420062 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:32:06.608314  420062 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:32:06.611190  420062 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:32:06.614228  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:06.614282  420062 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:32:06.614283  420062 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:32:06.614291  420062 cache.go:65] Caching tarball of preloaded images
	I1217 20:32:06.614394  420062 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:32:06.614404  420062 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:32:06.614527  420062 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:32:06.634867  420062 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:32:06.634879  420062 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:32:06.634892  420062 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:32:06.634927  420062 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:32:06.634983  420062 start.go:364] duration metric: took 39.828µs to acquireMachinesLock for "functional-682596"
	I1217 20:32:06.635002  420062 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:32:06.635007  420062 fix.go:54] fixHost starting: 
	I1217 20:32:06.635262  420062 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:32:06.652755  420062 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:32:06.652776  420062 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:32:06.656001  420062 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:32:06.656027  420062 machine.go:94] provisionDockerMachine start ...
	I1217 20:32:06.656117  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.673371  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.673711  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.673717  420062 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:32:06.807817  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.807832  420062 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:32:06.807905  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.825970  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.826266  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.826274  420062 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:32:06.965026  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.965108  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.983394  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.983695  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.983710  420062 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:32:07.116833  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:32:07.116850  420062 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:32:07.116869  420062 ubuntu.go:190] setting up certificates
	I1217 20:32:07.116877  420062 provision.go:84] configureAuth start
	I1217 20:32:07.116947  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.134531  420062 provision.go:143] copyHostCerts
	I1217 20:32:07.134601  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:32:07.134608  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:32:07.134696  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:32:07.134816  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:32:07.134820  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:32:07.134849  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:32:07.134907  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:32:07.134911  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:32:07.134937  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:32:07.134994  420062 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:32:07.402222  420062 provision.go:177] copyRemoteCerts
	I1217 20:32:07.402275  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:32:07.402313  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.421789  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.516787  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:32:07.535734  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:32:07.553569  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 20:32:07.572193  420062 provision.go:87] duration metric: took 455.301945ms to configureAuth
	I1217 20:32:07.572211  420062 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:32:07.572513  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:07.572520  420062 machine.go:97] duration metric: took 916.488302ms to provisionDockerMachine
	I1217 20:32:07.572527  420062 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:32:07.572544  420062 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:32:07.572595  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:32:07.572635  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.593078  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.688373  420062 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:32:07.691957  420062 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:32:07.691978  420062 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:32:07.691989  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:32:07.692044  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:32:07.692122  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:32:07.692197  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:32:07.692238  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:32:07.699873  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.718147  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:32:07.736089  420062 start.go:296] duration metric: took 163.546649ms for postStartSetup
	I1217 20:32:07.736163  420062 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:32:07.736210  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.753837  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.845496  420062 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:32:07.850448  420062 fix.go:56] duration metric: took 1.215434362s for fixHost
	I1217 20:32:07.850463  420062 start.go:83] releasing machines lock for "functional-682596", held for 1.215473649s
	I1217 20:32:07.850551  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.871450  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:07.871498  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:07.871505  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:07.871531  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:07.871602  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:07.871627  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:07.871680  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.871748  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:07.871798  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.889554  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.998672  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:08.024673  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:08.048014  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:08.055454  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.065155  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:08.073391  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077720  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077778  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.119356  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:08.127518  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.135465  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:08.143207  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147322  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147376  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.188376  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:08.196028  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.203401  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:08.211111  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214821  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214891  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.256072  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
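	# (sketch, not part of the captured log) The "test -L" probes above rely on
	# OpenSSL's hashed-name lookup: a CA such as /usr/share/ca-certificates/369461.pem
	# is trusted once /etc/ssl/certs/<subject-hash>.0 points at it, where the hash
	# is what this prints:
	#   openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem   # -> 51391683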
	I1217 20:32:08.263331  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:32:08.266724  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
	I1217 20:32:08.270040  420062 ssh_runner.go:195] Run: cat /version.json
	I1217 20:32:08.270111  420062 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:32:08.361093  420062 ssh_runner.go:195] Run: systemctl --version
	I1217 20:32:08.367706  420062 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:32:08.372063  420062 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:32:08.372127  420062 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:32:08.380119  420062 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:32:08.380133  420062 start.go:496] detecting cgroup driver to use...
	I1217 20:32:08.380163  420062 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:32:08.380223  420062 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:32:08.395765  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:32:08.409064  420062 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:32:08.409142  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:32:08.425141  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:32:08.438808  420062 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:32:08.558555  420062 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:32:08.681937  420062 docker.go:234] disabling docker service ...
	I1217 20:32:08.681997  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:32:08.701323  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:32:08.715923  420062 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:32:08.835610  420062 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:32:08.958372  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:32:08.972822  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:32:08.987570  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:32:08.997169  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:32:09.008742  420062 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:32:09.008821  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:32:09.018997  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.028318  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:32:09.037280  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.046375  420062 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:32:09.054925  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:32:09.064191  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:32:09.073303  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:32:09.082553  420062 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:32:09.090003  420062 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:32:09.097524  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.216967  420062 ssh_runner.go:195] Run: sudo systemctl restart containerd
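	# (sketch, not part of the captured log) The sed edits above pin containerd to
	# the "cgroupfs" driver before the restart; on the node this can be confirmed with:
	#   grep -n 'SystemdCgroup' /etc/containerd/config.toml   # expect: SystemdCgroup = false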
	I1217 20:32:09.360558  420062 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:32:09.360617  420062 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:32:09.364443  420062 start.go:564] Will wait 60s for crictl version
	I1217 20:32:09.364497  420062 ssh_runner.go:195] Run: which crictl
	I1217 20:32:09.368129  420062 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:32:09.397262  420062 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:32:09.397334  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.420778  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.446347  420062 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:32:09.449338  420062 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:32:09.466521  420062 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:32:09.473221  420062 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1217 20:32:09.476024  420062 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:32:09.476173  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:09.476285  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.523837  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.523848  420062 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:32:09.523905  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.551003  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.551014  420062 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:32:09.551021  420062 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:32:09.551143  420062 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1217 20:32:09.551208  420062 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:32:09.578643  420062 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1217 20:32:09.578665  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:09.578673  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:09.578683  420062 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:32:09.578707  420062 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:32:09.578827  420062 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
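The generated file above is a single YAML stream of four kubeadm documents (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) separated by ---. A minimal sketch of producing such a stream with Go's text/template; the field names here are illustrative, not minikube's actual template data:

package main

import (
	"os"
	"text/template"
)

// kubeadmTmpl renders a trimmed two-document version of the config
// above; values are filled in from the struct passed to Execute.
const kubeadmTmpl = `apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
localAPIEndpoint:
  advertiseAddress: {{.AdvertiseAddress}}
  bindPort: {{.BindPort}}
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
kubernetesVersion: {{.KubernetesVersion}}
controlPlaneEndpoint: {{.ControlPlaneEndpoint}}
`

func main() {
	data := struct {
		AdvertiseAddress, KubernetesVersion, ControlPlaneEndpoint string
		BindPort                                                  int
	}{"192.168.49.2", "v1.35.0-rc.1", "control-plane.minikube.internal:8441", 8441}
	t := template.Must(template.New("kubeadm").Parse(kubeadmTmpl))
	if err := t.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
}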
	
	I1217 20:32:09.578904  420062 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:32:09.586879  420062 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:32:09.586939  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:32:09.594505  420062 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:32:09.607281  420062 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:32:09.619808  420062 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1217 20:32:09.632685  420062 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:32:09.636364  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.746796  420062 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:32:10.238623  420062 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:32:10.238634  420062 certs.go:195] generating shared ca certs ...
	I1217 20:32:10.238650  420062 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:32:10.238819  420062 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:32:10.238897  420062 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:32:10.238904  420062 certs.go:257] generating profile certs ...
	I1217 20:32:10.238995  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:32:10.239044  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:32:10.239082  420062 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:32:10.239190  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:10.239221  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:10.239227  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:10.239261  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:10.239282  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:10.239304  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:10.239345  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:10.239934  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:32:10.261870  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:32:10.286466  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:32:10.307033  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:32:10.325172  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:32:10.343499  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:32:10.361814  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:32:10.379595  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:32:10.397590  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:10.415855  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:10.435021  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:10.453267  420062 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:32:10.466474  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:10.472863  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.480366  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:10.487904  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491724  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491791  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.533110  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:10.540758  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.548093  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:10.555384  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.558983  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.559039  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.602447  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:10.609962  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.617251  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:10.625102  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629186  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629244  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.670572  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:10.678295  420062 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:32:10.682347  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:32:10.723286  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:32:10.764614  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:32:10.806369  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:32:10.856829  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:32:10.900136  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
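Each openssl x509 -checkend 86400 call above asks whether the certificate expires within the next 24 hours (86400 seconds). A minimal Go equivalent using crypto/x509, offered as a sketch rather than minikube's implementation:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the PEM certificate at path expires
// within the next duration d, like `openssl x509 -checkend`.
func expiresWithin(path string, d time.Duration) (bool, error) {
	raw, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		return false, fmt.Errorf("no PEM data in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("expires within 24h:", soon)
}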
	I1217 20:32:10.941380  420062 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:10.941458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:32:10.941532  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:10.973304  420062 cri.go:89] found id: ""
	I1217 20:32:10.973369  420062 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:32:10.981213  420062 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:32:10.981233  420062 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:32:10.981284  420062 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:32:10.989643  420062 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:10.990148  420062 kubeconfig.go:125] found "functional-682596" server: "https://192.168.49.2:8441"
	I1217 20:32:10.991404  420062 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:32:11.001770  420062 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 20:17:35.203485302 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 20:32:09.624537089 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
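diff -u exits 0 when the two files match and 1 when they differ, and that exit status is what drives the "detected kubeadm config drift" decision above. A sketch of the same check in Go, with a hypothetical configDrifted helper:

package main

import (
	"errors"
	"fmt"
	"os/exec"
)

// configDrifted runs diff and maps its exit status to a decision:
// exit 0 = identical, exit 1 = files differ (reconfigure), >1 = error.
func configDrifted(oldPath, newPath string) (bool, string, error) {
	out, err := exec.Command("diff", "-u", oldPath, newPath).CombinedOutput()
	if err == nil {
		return false, "", nil // identical
	}
	var ee *exec.ExitError
	if errors.As(err, &ee) && ee.ExitCode() == 1 {
		return true, string(out), nil // drift detected
	}
	return false, "", err // diff itself failed
}

func main() {
	drifted, diff, err := configDrifted("/var/tmp/minikube/kubeadm.yaml", "/var/tmp/minikube/kubeadm.yaml.new")
	if err != nil {
		panic(err)
	}
	if drifted {
		fmt.Println("config drift detected:\n" + diff)
	}
}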
	I1217 20:32:11.001793  420062 kubeadm.go:1161] stopping kube-system containers ...
	I1217 20:32:11.001810  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 20:32:11.001907  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:11.031815  420062 cri.go:89] found id: ""
	I1217 20:32:11.031894  420062 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 20:32:11.052689  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:32:11.061497  420062 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec 17 20:21 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 17 20:21 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 17 20:21 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 17 20:21 /etc/kubernetes/scheduler.conf
	
	I1217 20:32:11.061561  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:32:11.069861  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:32:11.077903  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.077964  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:32:11.085969  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.094098  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.094177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.102002  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:32:11.110213  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.110288  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:32:11.119148  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:32:11.127567  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:11.176595  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.173518  420062 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.996897383s)
	I1217 20:32:13.173578  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.380045  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.450955  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
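The restart path above replays individual kubeadm init phases (certs, kubeconfig, kubelet-start, control-plane, etcd) against the refreshed /var/tmp/minikube/kubeadm.yaml instead of re-running a full kubeadm init. A simplified Go sketch of sequencing those phases; the PATH handling mirrors the log, while the helper itself is hypothetical:

package main

import (
	"fmt"
	"os"
	"os/exec"
)

// runPhases executes each kubeadm init phase in order, stopping at
// the first failure, as the restart sequence in the log does.
func runPhases() error {
	phases := []string{
		"certs all",
		"kubeconfig all",
		"kubelet-start",
		"control-plane all",
		"etcd local",
	}
	for _, p := range phases {
		cmd := fmt.Sprintf(
			`env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase %s --config /var/tmp/minikube/kubeadm.yaml`, p)
		c := exec.Command("sudo", "/bin/bash", "-c", cmd)
		c.Stdout, c.Stderr = os.Stdout, os.Stderr
		if err := c.Run(); err != nil {
			return fmt.Errorf("phase %q failed: %w", p, err)
		}
	}
	return nil
}

func main() {
	if err := runPhases(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}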
	I1217 20:32:13.494559  420062 api_server.go:52] waiting for apiserver process to appear ...
	I1217 20:32:13.494629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... the probe `sudo pgrep -xnf kube-apiserver.*minikube.*` repeats unchanged every ~500ms from 20:32:13.995499 through 20:33:12.995364 (roughly 120 attempts) while minikube waits for the apiserver process to appear; none of the attempts find a match ...]
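Polling a fixed-interval probe until a deadline, as the run above does, reduces to a ticker loop. A minimal sketch under the assumption of a hypothetical waitForProcess helper (minikube's actual wait lives in api_server.go):

package main

import (
	"fmt"
	"os/exec"
	"time"
)

// waitForProcess probes pgrep every 500ms until it matches or the
// timeout elapses; pgrep exits 0 only when a process matches.
func waitForProcess(pattern string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	tick := time.NewTicker(500 * time.Millisecond)
	defer tick.Stop()
	for time.Now().Before(deadline) {
		if exec.Command("sudo", "pgrep", "-xnf", pattern).Run() == nil {
			return nil
		}
		<-tick.C
	}
	return fmt.Errorf("timed out after %s waiting for %q", timeout, pattern)
}

func main() {
	if err := waitForProcess("kube-apiserver.*minikube.*", time.Minute); err != nil {
		fmt.Println(err)
	}
}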
	I1217 20:33:13.495637  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:13.495716  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:13.520703  420062 cri.go:89] found id: ""
	I1217 20:33:13.520717  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.520724  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:13.520729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:13.520793  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:13.549658  420062 cri.go:89] found id: ""
	I1217 20:33:13.549672  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.549680  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:13.549685  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:13.549748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:13.574860  420062 cri.go:89] found id: ""
	I1217 20:33:13.574873  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.574880  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:13.574885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:13.574945  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:13.602159  420062 cri.go:89] found id: ""
	I1217 20:33:13.602173  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.602180  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:13.602185  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:13.602244  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:13.625735  420062 cri.go:89] found id: ""
	I1217 20:33:13.625748  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.625755  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:13.625760  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:13.625816  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:13.650446  420062 cri.go:89] found id: ""
	I1217 20:33:13.650460  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.650468  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:13.650473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:13.650533  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:13.677915  420062 cri.go:89] found id: ""
	I1217 20:33:13.677929  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.677936  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:13.677944  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:13.677954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:13.692434  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:13.692449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:13.767790  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:13.767810  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:13.767820  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:13.839665  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:13.839685  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:13.872573  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:13.872589  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.429115  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:16.438989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:16.439051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:16.466518  420062 cri.go:89] found id: ""
	I1217 20:33:16.466532  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.466539  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:16.466545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:16.466602  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:16.492200  420062 cri.go:89] found id: ""
	I1217 20:33:16.492213  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.492221  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:16.492226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:16.492302  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:16.517055  420062 cri.go:89] found id: ""
	I1217 20:33:16.517070  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.517083  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:16.517088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:16.517148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:16.552138  420062 cri.go:89] found id: ""
	I1217 20:33:16.552152  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.552159  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:16.552165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:16.552235  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:16.577184  420062 cri.go:89] found id: ""
	I1217 20:33:16.577198  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.577214  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:16.577220  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:16.577279  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:16.602039  420062 cri.go:89] found id: ""
	I1217 20:33:16.602053  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.602060  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:16.602066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:16.602124  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:16.626732  420062 cri.go:89] found id: ""
	I1217 20:33:16.626745  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.626752  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:16.626760  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:16.626770  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:16.689454  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:16.689473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:16.722345  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:16.722363  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.784686  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:16.784705  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:16.801895  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:16.801911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:16.865697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:19.365915  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:19.375998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:19.376066  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:19.399955  420062 cri.go:89] found id: ""
	I1217 20:33:19.399968  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.399976  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:19.399981  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:19.400039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:19.424668  420062 cri.go:89] found id: ""
	I1217 20:33:19.424682  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.424689  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:19.424695  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:19.424755  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:19.449865  420062 cri.go:89] found id: ""
	I1217 20:33:19.449879  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.449886  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:19.449891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:19.449958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:19.474803  420062 cri.go:89] found id: ""
	I1217 20:33:19.474816  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.474833  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:19.474838  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:19.474909  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:19.503551  420062 cri.go:89] found id: ""
	I1217 20:33:19.503579  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.503598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:19.503603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:19.503687  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:19.529232  420062 cri.go:89] found id: ""
	I1217 20:33:19.529246  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.529259  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:19.529264  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:19.529330  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:19.554443  420062 cri.go:89] found id: ""
	I1217 20:33:19.554456  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.554463  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:19.554481  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:19.554491  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:19.609391  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:19.609411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:19.625653  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:19.625669  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:19.691445  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:19.691456  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:19.691466  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:19.754663  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:19.754682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
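The gathering cycle above closes with a container-status probe written as a shell fallback: use `crictl` if present, otherwise fall back to `docker ps -a`. As a minimal illustration (not minikube's actual implementation), the same fallback can be expressed in Go with nothing but `os/exec`; `sudo`, `crictl`, and `docker` being on PATH are assumptions:

```go
// Illustrative sketch of the "crictl ps -a || docker ps -a" fallback from the
// log above. Not minikube's code; assumes sudo, crictl, and docker exist.
package main

import (
	"fmt"
	"os/exec"
)

// listContainers tries crictl first and falls back to the docker CLI,
// mirroring the shell one-liner in the log.
func listContainers() (string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a").CombinedOutput()
	if err == nil {
		return string(out), nil
	}
	// crictl missing or failing: try docker instead.
	out, err = exec.Command("sudo", "docker", "ps", "-a").CombinedOutput()
	return string(out), err
}

func main() {
	out, err := listContainers()
	if err != nil {
		fmt.Println("both crictl and docker failed:", err)
		return
	}
	fmt.Print(out)
}
```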
	I1217 20:33:22.297725  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:22.309139  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:22.309199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:22.334369  420062 cri.go:89] found id: ""
	I1217 20:33:22.334382  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.334390  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:22.334395  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:22.334458  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:22.363418  420062 cri.go:89] found id: ""
	I1217 20:33:22.363445  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.363453  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:22.363458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:22.363531  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:22.388924  420062 cri.go:89] found id: ""
	I1217 20:33:22.388939  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.388947  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:22.388993  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:22.389056  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:22.415757  420062 cri.go:89] found id: ""
	I1217 20:33:22.415780  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.415787  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:22.415793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:22.415872  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:22.441520  420062 cri.go:89] found id: ""
	I1217 20:33:22.441534  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.441541  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:22.441546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:22.441605  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:22.480775  420062 cri.go:89] found id: ""
	I1217 20:33:22.480789  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.480795  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:22.480801  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:22.480873  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:22.505556  420062 cri.go:89] found id: ""
	I1217 20:33:22.505570  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.505577  420062 logs.go:284] No container was found matching "kindnet"
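Each round begins by probing every expected control-plane component by name with `crictl ps -a --quiet --name=...` and warning when nothing matches, which is exactly what happens here: no component has a container. A hedged sketch of that per-component loop follows; the component list is copied from the log, the rest is illustrative:

```go
// Illustrative per-component probe loop. The component names match the log;
// the reporting format is not minikube's.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	components := []string{
		"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet",
	}
	for _, name := range components {
		// Same probe as the log: list all container IDs whose name matches.
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
		ids := strings.Fields(string(out))
		if err != nil || len(ids) == 0 {
			fmt.Printf("no container found matching %q\n", name)
			continue
		}
		fmt.Printf("%s: %d container(s)\n", name, len(ids))
	}
}
```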
	I1217 20:33:22.505585  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:22.505596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:22.562036  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:22.562054  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:22.577369  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:22.577386  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:22.647423  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:22.638838   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.639486   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641272   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641956   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.643602   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
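Every `describe nodes` attempt fails identically: kubectl cannot reach the apiserver on localhost:8441, the process exits with status 1, stdout stays empty, and only stderr carries the connection-refused errors. A minimal sketch of running that probe while keeping stdout and stderr separate (the binary path and kubeconfig path are copied from the log; the error formatting is illustrative):

```go
// Sketch of the failing "describe nodes" probe. Paths mirror the log; this is
// not minikube's implementation.
package main

import (
	"bytes"
	"fmt"
	"os/exec"
)

func main() {
	cmd := exec.Command("sudo",
		"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
		"describe", "nodes",
		"--kubeconfig=/var/lib/minikube/kubeconfig")
	var stdout, stderr bytes.Buffer
	cmd.Stdout, cmd.Stderr = &stdout, &stderr
	if err := cmd.Run(); err != nil {
		// With the apiserver down this reports the exit status plus the same
		// "connection refused" stderr seen above.
		fmt.Printf("failed describe nodes: %v\nstderr:\n%s", err, stderr.String())
		return
	}
	fmt.Print(stdout.String())
}
```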
	I1217 20:33:22.647453  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:22.647464  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:22.710153  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:22.710173  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.239783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:25.250945  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:25.251006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:25.277422  420062 cri.go:89] found id: ""
	I1217 20:33:25.277435  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.277443  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:25.277448  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:25.277510  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:25.303032  420062 cri.go:89] found id: ""
	I1217 20:33:25.303051  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.303063  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:25.303070  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:25.303176  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:25.333183  420062 cri.go:89] found id: ""
	I1217 20:33:25.333197  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.333204  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:25.333209  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:25.333272  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:25.358899  420062 cri.go:89] found id: ""
	I1217 20:33:25.358913  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.358920  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:25.358926  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:25.358986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:25.388611  420062 cri.go:89] found id: ""
	I1217 20:33:25.388625  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.388633  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:25.388638  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:25.388704  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:25.415829  420062 cri.go:89] found id: ""
	I1217 20:33:25.415844  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.415852  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:25.415857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:25.415913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:25.442921  420062 cri.go:89] found id: ""
	I1217 20:33:25.442935  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.442941  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:25.442949  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:25.442965  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:25.459113  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:25.459135  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:25.535629  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:25.526636   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.527172   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.528838   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.529443   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.530989   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:25.535645  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:25.535655  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:25.601950  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:25.601968  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.634192  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:25.634208  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
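The timestamps show the whole cycle repeating roughly every three seconds, each round opening with the same `pgrep -xnf kube-apiserver.*minikube.*` liveness check. A minimal poll-until-deadline loop in that spirit (the three-second interval matches the log; the four-minute deadline is an assumed placeholder, not minikube's actual timeout):

```go
// Illustrative poll loop: retry the pgrep probe from the log until the
// apiserver process appears or a deadline passes. The deadline value is an
// assumption for the sketch.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

func main() {
	deadline := time.Now().Add(4 * time.Minute)
	for time.Now().Before(deadline) {
		// Same probe as the log: exact full-command-line match via pgrep.
		if exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil {
			fmt.Println("kube-apiserver process is up")
			return
		}
		time.Sleep(3 * time.Second)
	}
	fmt.Println("timed out waiting for kube-apiserver")
}
```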
	I1217 20:33:28.190569  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:28.200504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:28.200563  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:28.224311  420062 cri.go:89] found id: ""
	I1217 20:33:28.224325  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.224332  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:28.224338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:28.224396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:28.252603  420062 cri.go:89] found id: ""
	I1217 20:33:28.252622  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.252629  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:28.252634  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:28.252692  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:28.276684  420062 cri.go:89] found id: ""
	I1217 20:33:28.276697  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.276704  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:28.276709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:28.276777  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:28.299922  420062 cri.go:89] found id: ""
	I1217 20:33:28.299935  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.299942  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:28.299947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:28.300014  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:28.326124  420062 cri.go:89] found id: ""
	I1217 20:33:28.326137  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.326144  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:28.326150  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:28.326218  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:28.349497  420062 cri.go:89] found id: ""
	I1217 20:33:28.349510  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.349517  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:28.349523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:28.349579  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:28.378156  420062 cri.go:89] found id: ""
	I1217 20:33:28.378170  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.378177  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:28.378185  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:28.378194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.434254  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:28.434274  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:28.448810  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:28.448837  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:28.521268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:28.512905   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.513656   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515366   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515890   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.517404   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:28.521279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:28.521290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:28.584201  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:28.584222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
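The "Gathering logs for ..." steps are fixed shell one-liners over journalctl and dmesg, run in a set order each round. A small sketch that collects the same three sources (the commands are copied verbatim from the log; gathering them into a map and printing sizes is illustrative):

```go
// Illustrative log-gathering pass. The commands are verbatim from the log;
// the surrounding structure is not minikube's.
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	probes := map[string]string{
		"kubelet":    "sudo journalctl -u kubelet -n 400",
		"containerd": "sudo journalctl -u containerd -n 400",
		"dmesg":      "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400",
	}
	for name, cmdline := range probes {
		out, err := exec.Command("/bin/bash", "-c", cmdline).CombinedOutput()
		if err != nil {
			fmt.Printf("gathering %s failed: %v\n", name, err)
			continue
		}
		fmt.Printf("gathered %s: %d bytes\n", name, len(out))
	}
}
```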
	I1217 20:33:31.112699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:31.123315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:31.123377  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:31.151761  420062 cri.go:89] found id: ""
	I1217 20:33:31.151776  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.151783  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:31.151789  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:31.151849  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:31.177165  420062 cri.go:89] found id: ""
	I1217 20:33:31.177178  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.177186  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:31.177191  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:31.177262  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:31.205229  420062 cri.go:89] found id: ""
	I1217 20:33:31.205260  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.205267  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:31.205272  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:31.205341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:31.229570  420062 cri.go:89] found id: ""
	I1217 20:33:31.229584  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.229591  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:31.229597  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:31.229673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:31.258880  420062 cri.go:89] found id: ""
	I1217 20:33:31.258904  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.258911  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:31.258917  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:31.258983  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:31.286222  420062 cri.go:89] found id: ""
	I1217 20:33:31.286241  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.286248  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:31.286253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:31.286315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:31.311291  420062 cri.go:89] found id: ""
	I1217 20:33:31.311314  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.311322  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:31.311330  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:31.311340  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.342524  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:31.342541  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:31.398421  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:31.398440  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:31.413476  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:31.413497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:31.478376  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:31.469734   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.470537   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472118   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472657   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.474358   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:31.478388  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:31.478398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.044394  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:34.054571  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:34.054632  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:34.078791  420062 cri.go:89] found id: ""
	I1217 20:33:34.078815  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.078822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:34.078827  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:34.078902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:34.103484  420062 cri.go:89] found id: ""
	I1217 20:33:34.103498  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.103505  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:34.103510  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:34.103578  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:34.128330  420062 cri.go:89] found id: ""
	I1217 20:33:34.128343  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.128362  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:34.128368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:34.128436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:34.156115  420062 cri.go:89] found id: ""
	I1217 20:33:34.156129  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.156136  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:34.156141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:34.156208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:34.179862  420062 cri.go:89] found id: ""
	I1217 20:33:34.179876  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.179884  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:34.179889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:34.179959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:34.205717  420062 cri.go:89] found id: ""
	I1217 20:33:34.205731  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.205739  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:34.205745  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:34.205804  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:34.230674  420062 cri.go:89] found id: ""
	I1217 20:33:34.230689  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.230702  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:34.230710  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:34.230720  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:34.286930  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:34.286949  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:34.301786  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:34.301803  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:34.365439  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:34.365461  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:34.365473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.426703  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:34.426724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
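The underlying symptom in every failure is the same dial error, `dial tcp [::1]:8441: connect: connection refused`, meaning nothing is listening on the apiserver port at all. That single fact can be reproduced with a bare TCP dial (host and port copied from the log; the two-second timeout is an assumption):

```go
// Bare TCP check against the apiserver endpoint the kubectl probes keep
// failing on. Sketch only; the timeout is chosen arbitrarily.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
	if err != nil {
		// With no apiserver this prints "connect: connection refused",
		// matching the errors in the log.
		fmt.Println("apiserver port closed:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver port open")
}
```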
	I1217 20:33:36.954941  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:36.964889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:36.964949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:37.000981  420062 cri.go:89] found id: ""
	I1217 20:33:37.000999  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.001008  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:37.001014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:37.001098  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:37.036987  420062 cri.go:89] found id: ""
	I1217 20:33:37.037001  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.037008  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:37.037013  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:37.037083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:37.067078  420062 cri.go:89] found id: ""
	I1217 20:33:37.067092  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.067099  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:37.067105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:37.067173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:37.101494  420062 cri.go:89] found id: ""
	I1217 20:33:37.101509  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.101516  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:37.101522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:37.101582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:37.125577  420062 cri.go:89] found id: ""
	I1217 20:33:37.125591  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.125599  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:37.125604  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:37.125672  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:37.155006  420062 cri.go:89] found id: ""
	I1217 20:33:37.155022  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.155040  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:37.155045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:37.155105  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:37.180061  420062 cri.go:89] found id: ""
	I1217 20:33:37.180075  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.180082  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:37.180090  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:37.180110  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:37.235716  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:37.235744  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:37.250676  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:37.250704  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:37.314789  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:37.314799  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:37.314811  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:37.376546  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:37.376566  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:39.904036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:39.914146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:39.914209  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:39.942353  420062 cri.go:89] found id: ""
	I1217 20:33:39.942366  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.942374  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:39.942379  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:39.942445  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:39.970090  420062 cri.go:89] found id: ""
	I1217 20:33:39.970105  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.970113  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:39.970119  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:39.970185  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:40.013204  420062 cri.go:89] found id: ""
	I1217 20:33:40.013220  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.013228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:40.013234  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:40.013312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:40.055438  420062 cri.go:89] found id: ""
	I1217 20:33:40.055453  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.055461  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:40.055467  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:40.055532  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:40.088240  420062 cri.go:89] found id: ""
	I1217 20:33:40.088285  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.088293  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:40.088298  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:40.088361  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:40.116666  420062 cri.go:89] found id: ""
	I1217 20:33:40.116680  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.116687  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:40.116693  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:40.116752  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:40.143935  420062 cri.go:89] found id: ""
	I1217 20:33:40.143951  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.143965  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:40.143973  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:40.143986  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:40.199464  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:40.199484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:40.214665  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:40.214682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:40.285603  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:40.285613  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:40.285623  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:40.348551  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:40.348571  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:42.882366  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:42.892346  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:42.892407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:42.917526  420062 cri.go:89] found id: ""
	I1217 20:33:42.917540  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.917548  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:42.917553  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:42.917622  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:42.941649  420062 cri.go:89] found id: ""
	I1217 20:33:42.941663  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.941670  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:42.941675  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:42.941737  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:42.965314  420062 cri.go:89] found id: ""
	I1217 20:33:42.965328  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.965335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:42.965341  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:42.965399  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:42.992861  420062 cri.go:89] found id: ""
	I1217 20:33:42.992875  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.992882  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:42.992888  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:42.992949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:43.026962  420062 cri.go:89] found id: ""
	I1217 20:33:43.026977  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.026984  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:43.026989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:43.027048  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:43.056268  420062 cri.go:89] found id: ""
	I1217 20:33:43.056282  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.056289  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:43.056295  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:43.056353  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:43.088527  420062 cri.go:89] found id: ""
	I1217 20:33:43.088542  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.088549  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:43.088556  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:43.088567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:43.115028  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:43.115044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:43.170239  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:43.170258  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:43.185453  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:43.185468  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:43.255155  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:43.255166  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:43.255176  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:45.818750  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:45.829020  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:45.829084  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:45.854296  420062 cri.go:89] found id: ""
	I1217 20:33:45.854310  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.854319  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:45.854327  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:45.854393  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:45.884706  420062 cri.go:89] found id: ""
	I1217 20:33:45.884720  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.884728  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:45.884733  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:45.884795  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:45.909518  420062 cri.go:89] found id: ""
	I1217 20:33:45.909533  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.909540  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:45.909545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:45.909615  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:45.935050  420062 cri.go:89] found id: ""
	I1217 20:33:45.935065  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.935073  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:45.935078  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:45.935155  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:45.964622  420062 cri.go:89] found id: ""
	I1217 20:33:45.964636  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.964643  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:45.964648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:45.964714  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:45.992340  420062 cri.go:89] found id: ""
	I1217 20:33:45.992355  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.992363  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:45.992368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:45.992432  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:46.029800  420062 cri.go:89] found id: ""
	I1217 20:33:46.029815  420062 logs.go:282] 0 containers: []
	W1217 20:33:46.029822  420062 logs.go:284] No container was found matching "kindnet"
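
Each pass sweeps the same seven control-plane component names through "crictl ps -a --quiet --name=<component>" and finds nothing in any state (ps -a includes exited containers). A self-contained sketch of that sweep, assuming only that crictl is on the PATH and sudo is available (illustrative, not the cri.go implementation):

    // Illustrative sweep over the component names queried in the log above.
    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    func main() {
        components := []string{
            "kube-apiserver", "etcd", "coredns", "kube-scheduler",
            "kube-proxy", "kube-controller-manager", "kindnet",
        }
        for _, name := range components {
            out, err := exec.Command("sudo", "crictl", "ps", "-a",
                "--quiet", "--name="+name).Output()
            if err != nil {
                fmt.Printf("crictl failed for %q: %v\n", name, err)
                continue
            }
            ids := strings.Fields(string(out))
            fmt.Printf("%s: %d container(s)\n", name, len(ids)) // 0 throughout this log
        }
    }
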
	I1217 20:33:46.029841  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:46.029852  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:46.096203  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:46.096224  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:46.111499  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:46.111517  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:46.174259  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:46.174269  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:46.174282  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:46.239891  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:46.239911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
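
The container-status command relies on a shell fallback: the substitution "which crictl || echo crictl" expands to crictl's full path when the binary is found, and to the bare word crictl otherwise; if that invocation then fails, the trailing "|| sudo docker ps -a" falls back to Docker. Here, on a containerd runtime, the crictl branch runs and simply lists no Kubernetes containers.
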
	I1217 20:33:48.769726  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:48.779731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:48.779796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:48.803697  420062 cri.go:89] found id: ""
	I1217 20:33:48.803710  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.803718  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:48.803723  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:48.803790  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:48.828947  420062 cri.go:89] found id: ""
	I1217 20:33:48.828966  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.828974  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:48.828979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:48.829045  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:48.853794  420062 cri.go:89] found id: ""
	I1217 20:33:48.853809  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.853815  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:48.853821  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:48.853884  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:48.879220  420062 cri.go:89] found id: ""
	I1217 20:33:48.879234  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.879241  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:48.879253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:48.879316  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:48.905546  420062 cri.go:89] found id: ""
	I1217 20:33:48.905560  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.905567  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:48.905573  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:48.905639  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:48.931025  420062 cri.go:89] found id: ""
	I1217 20:33:48.931040  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.931047  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:48.931053  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:48.931111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:48.959554  420062 cri.go:89] found id: ""
	I1217 20:33:48.959567  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.959575  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:48.959591  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:48.959603  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:49.037548  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:49.037558  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:49.037576  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:49.104606  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:49.104628  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:49.132120  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:49.132142  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:49.189781  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:49.189799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
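
The dmesg invocation is tuned for capture rather than interactive reading: -H formats human-readable timestamps, -P suppresses the pager that -H would otherwise start, -L=never disables color escapes, --level restricts output to warning severity and above, and the pipe through tail -n 400 caps the volume shipped back over SSH.
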
	I1217 20:33:51.705313  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:51.715310  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:51.715375  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:51.742788  420062 cri.go:89] found id: ""
	I1217 20:33:51.742803  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.742810  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:51.742816  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:51.742878  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:51.768132  420062 cri.go:89] found id: ""
	I1217 20:33:51.768147  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.768154  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:51.768160  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:51.768220  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:51.796803  420062 cri.go:89] found id: ""
	I1217 20:33:51.796817  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.796825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:51.796831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:51.796891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:51.823032  420062 cri.go:89] found id: ""
	I1217 20:33:51.823046  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.823054  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:51.823061  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:51.823122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:51.848750  420062 cri.go:89] found id: ""
	I1217 20:33:51.848765  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.848773  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:51.848778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:51.848840  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:51.874494  420062 cri.go:89] found id: ""
	I1217 20:33:51.874509  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.874516  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:51.874522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:51.874582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:51.912240  420062 cri.go:89] found id: ""
	I1217 20:33:51.912273  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.912281  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:51.912290  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:51.912301  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:51.940881  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:51.940897  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:51.997574  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:51.997596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:52.016000  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:52.016018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:52.093264  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:52.093274  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:52.093286  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
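
Each pass opens with pgrep -xnf kube-apiserver.*minikube.*, which asks for the newest process (-n) whose full command line (-f) exactly matches (-x) that pattern; it is a cheap "is an apiserver process even running?" probe before the heavier crictl sweep.
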
	I1217 20:33:54.657449  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:54.667679  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:54.667741  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:54.696106  420062 cri.go:89] found id: ""
	I1217 20:33:54.696121  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.696128  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:54.696133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:54.696194  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:54.720578  420062 cri.go:89] found id: ""
	I1217 20:33:54.720592  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.720599  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:54.720605  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:54.720669  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:54.746036  420062 cri.go:89] found id: ""
	I1217 20:33:54.746050  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.746058  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:54.746063  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:54.746122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:54.770192  420062 cri.go:89] found id: ""
	I1217 20:33:54.770206  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.770213  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:54.770219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:54.770275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:54.794365  420062 cri.go:89] found id: ""
	I1217 20:33:54.794379  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.794386  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:54.794391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:54.794454  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:54.818424  420062 cri.go:89] found id: ""
	I1217 20:33:54.818438  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.818446  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:54.818451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:54.818513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:54.843360  420062 cri.go:89] found id: ""
	I1217 20:33:54.843375  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.843382  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:54.843401  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:54.843412  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:54.872684  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:54.872701  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:54.928831  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:54.928851  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:54.943545  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:54.943561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:55.020697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:55.020721  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:55.020734  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.590507  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:57.600840  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:57.600911  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:57.628650  420062 cri.go:89] found id: ""
	I1217 20:33:57.628664  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.628671  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:57.628676  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:57.628736  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:57.653915  420062 cri.go:89] found id: ""
	I1217 20:33:57.653929  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.653936  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:57.653941  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:57.654005  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:57.677881  420062 cri.go:89] found id: ""
	I1217 20:33:57.677894  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.677901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:57.677906  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:57.677974  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:57.701808  420062 cri.go:89] found id: ""
	I1217 20:33:57.701823  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.701830  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:57.701836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:57.701894  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:57.725682  420062 cri.go:89] found id: ""
	I1217 20:33:57.725696  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.725703  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:57.725708  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:57.725770  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:57.753864  420062 cri.go:89] found id: ""
	I1217 20:33:57.753878  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.753885  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:57.753891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:57.753948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:57.779180  420062 cri.go:89] found id: ""
	I1217 20:33:57.779193  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.779200  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:57.779216  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:57.779227  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:57.834554  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:57.834575  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:57.849468  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:57.849484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:57.917796  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:57.917816  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:57.917827  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.980535  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:57.980556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:00.519246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:00.531028  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:00.531090  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:00.557919  420062 cri.go:89] found id: ""
	I1217 20:34:00.557933  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.557941  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:00.557947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:00.558006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:00.583357  420062 cri.go:89] found id: ""
	I1217 20:34:00.583381  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.583389  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:00.583394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:00.583461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:00.608300  420062 cri.go:89] found id: ""
	I1217 20:34:00.608313  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.608321  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:00.608326  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:00.608396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:00.633249  420062 cri.go:89] found id: ""
	I1217 20:34:00.633263  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.633271  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:00.633277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:00.633354  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:00.657998  420062 cri.go:89] found id: ""
	I1217 20:34:00.658012  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.658020  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:00.658025  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:00.658083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:00.686479  420062 cri.go:89] found id: ""
	I1217 20:34:00.686494  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.686502  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:00.686517  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:00.686600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:00.715237  420062 cri.go:89] found id: ""
	I1217 20:34:00.715251  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.715259  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:00.715281  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:00.715297  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:00.771736  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:00.771756  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:00.786569  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:00.786584  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:00.855532  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:00.846820   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.847617   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849290   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849821   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.851435   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:00.855544  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:00.855556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:00.929889  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:00.929917  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.457778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:03.467767  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:03.467830  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:03.491745  420062 cri.go:89] found id: ""
	I1217 20:34:03.491760  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.491767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:03.491772  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:03.491834  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:03.516486  420062 cri.go:89] found id: ""
	I1217 20:34:03.516501  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.516508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:03.516514  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:03.516573  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:03.545504  420062 cri.go:89] found id: ""
	I1217 20:34:03.545518  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.545526  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:03.545531  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:03.545592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:03.570752  420062 cri.go:89] found id: ""
	I1217 20:34:03.570766  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.570773  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:03.570779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:03.570837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:03.599464  420062 cri.go:89] found id: ""
	I1217 20:34:03.599478  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.599486  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:03.599491  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:03.599551  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:03.626193  420062 cri.go:89] found id: ""
	I1217 20:34:03.626209  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.626217  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:03.626222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:03.626280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:03.650682  420062 cri.go:89] found id: ""
	I1217 20:34:03.650696  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.650704  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:03.650712  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:03.650724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:03.712614  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:03.705244   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.705869   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.706805   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.707331   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.708827   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:03.712625  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:03.712636  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:03.775226  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:03.775247  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.801581  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:03.801600  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:03.857991  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:03.858013  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.373018  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:06.382912  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:06.382972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:06.408596  420062 cri.go:89] found id: ""
	I1217 20:34:06.408610  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.408617  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:06.408622  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:06.408681  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:06.437062  420062 cri.go:89] found id: ""
	I1217 20:34:06.437076  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.437083  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:06.437088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:06.437149  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:06.463109  420062 cri.go:89] found id: ""
	I1217 20:34:06.463123  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.463130  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:06.463135  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:06.463198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:06.487450  420062 cri.go:89] found id: ""
	I1217 20:34:06.487463  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.487470  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:06.487476  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:06.487537  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:06.512848  420062 cri.go:89] found id: ""
	I1217 20:34:06.512863  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.512870  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:06.512876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:06.512939  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:06.536984  420062 cri.go:89] found id: ""
	I1217 20:34:06.536998  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.537006  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:06.537011  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:06.537069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:06.565689  420062 cri.go:89] found id: ""
	I1217 20:34:06.565732  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.565740  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:06.565748  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:06.565758  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:06.626274  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:06.626294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.641612  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:06.641630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:06.703082  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:06.694717   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.695365   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697091   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697739   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.699357   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:06.703092  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:06.703104  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:06.768202  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:06.768221  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
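
The timestamps show the diagnostic loop running on a fixed cadence of roughly one pass every three seconds, from 20:33:43 through 20:34:09, with identical empty results each time; there is no sign of the apiserver making progress between passes.
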
	I1217 20:34:09.296397  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:09.306558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:09.306619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:09.330814  420062 cri.go:89] found id: ""
	I1217 20:34:09.330828  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.330836  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:09.330841  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:09.330900  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:09.360228  420062 cri.go:89] found id: ""
	I1217 20:34:09.360242  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.360270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:09.360276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:09.360336  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:09.383852  420062 cri.go:89] found id: ""
	I1217 20:34:09.383865  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.383871  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:09.383876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:09.383933  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:09.408740  420062 cri.go:89] found id: ""
	I1217 20:34:09.408753  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.408760  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:09.408765  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:09.408824  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:09.433879  420062 cri.go:89] found id: ""
	I1217 20:34:09.433894  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.433901  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:09.433907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:09.433965  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:09.458138  420062 cri.go:89] found id: ""
	I1217 20:34:09.458152  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.458160  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:09.458165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:09.458223  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:09.482170  420062 cri.go:89] found id: ""
	I1217 20:34:09.482184  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.482191  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:09.482199  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:09.482214  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:09.539809  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:09.539831  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:09.555108  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:09.555124  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:09.617755  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:09.617779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:09.617790  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:09.680900  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:09.680920  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:12.217262  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:12.227378  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:12.227441  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:12.260904  420062 cri.go:89] found id: ""
	I1217 20:34:12.260918  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.260926  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:12.260931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:12.260991  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:12.290600  420062 cri.go:89] found id: ""
	I1217 20:34:12.290614  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.290621  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:12.290626  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:12.290694  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:12.317694  420062 cri.go:89] found id: ""
	I1217 20:34:12.317708  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.317716  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:12.317721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:12.317789  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:12.347280  420062 cri.go:89] found id: ""
	I1217 20:34:12.347300  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.347308  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:12.347323  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:12.347382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:12.375032  420062 cri.go:89] found id: ""
	I1217 20:34:12.375046  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.375054  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:12.375060  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:12.375121  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:12.400749  420062 cri.go:89] found id: ""
	I1217 20:34:12.400763  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.400771  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:12.400779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:12.400837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:12.425915  420062 cri.go:89] found id: ""
	I1217 20:34:12.425929  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.425937  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:12.425946  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:12.425957  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:12.486250  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:12.486269  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:12.501500  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:12.501515  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:12.571896  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:12.563218   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564136   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564679   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566300   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566801   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:12.571906  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:12.571921  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:12.635853  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:12.635876  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.166604  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:15.177581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:15.177645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:15.201800  420062 cri.go:89] found id: ""
	I1217 20:34:15.201815  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.201822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:15.201828  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:15.201892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:15.229609  420062 cri.go:89] found id: ""
	I1217 20:34:15.229624  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.229631  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:15.229636  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:15.229703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:15.257583  420062 cri.go:89] found id: ""
	I1217 20:34:15.257597  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.257605  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:15.257610  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:15.257673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:15.291085  420062 cri.go:89] found id: ""
	I1217 20:34:15.291099  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.291106  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:15.291112  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:15.291190  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:15.324198  420062 cri.go:89] found id: ""
	I1217 20:34:15.324212  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.324219  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:15.324226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:15.324317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:15.348977  420062 cri.go:89] found id: ""
	I1217 20:34:15.348991  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.348998  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:15.349004  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:15.349069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:15.373132  420062 cri.go:89] found id: ""
	I1217 20:34:15.373147  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.373155  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:15.373162  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:15.373174  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:15.387711  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:15.387728  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:15.453164  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:15.443181   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.443915   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.445657   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.447470   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.448047   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:15.453175  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:15.453187  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:15.519197  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:15.519219  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.547781  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:15.547799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.106475  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:18.117557  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:18.117619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:18.142233  420062 cri.go:89] found id: ""
	I1217 20:34:18.142246  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.142253  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:18.142258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:18.142319  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:18.166913  420062 cri.go:89] found id: ""
	I1217 20:34:18.166927  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.166934  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:18.166940  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:18.167002  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:18.195856  420062 cri.go:89] found id: ""
	I1217 20:34:18.195870  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.195877  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:18.195883  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:18.195944  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:18.222291  420062 cri.go:89] found id: ""
	I1217 20:34:18.222306  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.222313  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:18.222318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:18.222382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:18.254911  420062 cri.go:89] found id: ""
	I1217 20:34:18.254925  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.254932  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:18.254937  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:18.254996  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:18.299082  420062 cri.go:89] found id: ""
	I1217 20:34:18.299096  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.299103  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:18.299109  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:18.299173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:18.323848  420062 cri.go:89] found id: ""
	I1217 20:34:18.323862  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.323869  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:18.323877  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:18.323888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.381056  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:18.381082  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:18.395602  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:18.395617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:18.459223  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:18.450909   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.451543   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453107   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453711   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.455276   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:18.459233  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:18.459244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:18.522287  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:18.522307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
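	Every cycle above hinges on the same check: "crictl ps -a --quiet --name=<component>" prints one container ID per line, and an empty result is what produces the `found id: ""` / `0 containers: []` lines. A minimal Go sketch of that check, assuming local shell access rather than minikube's ssh_runner (the helper name listCRIContainers and the component list are illustrative):

	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	// listCRIContainers runs the same crictl probe as the log and returns
	// the container IDs it prints, one per line; empty output corresponds
	// to the "0 containers: []" lines above.
	func listCRIContainers(name string) ([]string, error) {
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
		if err != nil {
			return nil, err
		}
		var ids []string
		for _, line := range strings.Split(string(out), "\n") {
			if id := strings.TrimSpace(line); id != "" {
				ids = append(ids, id)
			}
		}
		return ids, nil
	}

	func main() {
		for _, name := range []string{"kube-apiserver", "etcd", "coredns", "kube-scheduler"} {
			ids, err := listCRIContainers(name)
			if err != nil {
				fmt.Println(name, "probe failed:", err)
				continue
			}
			fmt.Printf("%s: %d containers: %v\n", name, len(ids), ids)
		}
	}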
	I1217 20:34:21.051832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:21.062206  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:21.062275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:21.090124  420062 cri.go:89] found id: ""
	I1217 20:34:21.090139  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.090146  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:21.090151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:21.090211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:21.114268  420062 cri.go:89] found id: ""
	I1217 20:34:21.114282  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.114289  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:21.114294  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:21.114357  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:21.141585  420062 cri.go:89] found id: ""
	I1217 20:34:21.141599  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.141606  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:21.141611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:21.141673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:21.167173  420062 cri.go:89] found id: ""
	I1217 20:34:21.167187  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.167195  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:21.167200  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:21.167277  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:21.191543  420062 cri.go:89] found id: ""
	I1217 20:34:21.191557  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.191564  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:21.191569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:21.191640  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:21.219365  420062 cri.go:89] found id: ""
	I1217 20:34:21.219378  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.219385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:21.219390  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:21.219451  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:21.256303  420062 cri.go:89] found id: ""
	I1217 20:34:21.256317  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.256324  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:21.256332  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:21.256342  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:21.323014  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:21.323035  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:21.337647  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:21.337664  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:21.400131  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:21.391524   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.392409   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394006   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394305   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.395921   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:21.400140  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:21.400151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:21.467704  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:21.467725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:23.996278  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:24.008421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:24.008487  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:24.035322  420062 cri.go:89] found id: ""
	I1217 20:34:24.035336  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.035344  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:24.035349  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:24.035413  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:24.060026  420062 cri.go:89] found id: ""
	I1217 20:34:24.060040  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.060048  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:24.060054  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:24.060131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:24.085236  420062 cri.go:89] found id: ""
	I1217 20:34:24.085250  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.085257  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:24.085263  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:24.085323  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:24.110730  420062 cri.go:89] found id: ""
	I1217 20:34:24.110763  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.110772  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:24.110778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:24.110851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:24.138006  420062 cri.go:89] found id: ""
	I1217 20:34:24.138020  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.138028  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:24.138034  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:24.138094  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:24.168065  420062 cri.go:89] found id: ""
	I1217 20:34:24.168080  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.168094  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:24.168100  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:24.168172  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:24.193244  420062 cri.go:89] found id: ""
	I1217 20:34:24.193258  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.193265  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:24.193273  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:24.193284  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:24.260181  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:24.260201  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:24.299429  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:24.299446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:24.355633  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:24.355653  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:24.371493  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:24.371508  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:24.439767  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
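	Each roughly three-second iteration is the same probe: pgrep for a kube-apiserver process, then the crictl listing, repeated until a deadline. A rough Go sketch of that wait loop under the same local-shell assumption (the two-minute deadline is illustrative, not minikube's actual timeout):

	package main

	import (
		"fmt"
		"os/exec"
		"strings"
		"time"
	)

	// apiserverUp mirrors the two probes in the log: a pgrep for the
	// kube-apiserver process, then a crictl listing of its container.
	func apiserverUp() bool {
		if exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil {
			return true
		}
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name=kube-apiserver").Output()
		return err == nil && strings.TrimSpace(string(out)) != ""
	}

	func main() {
		deadline := time.Now().Add(2 * time.Minute) // illustrative deadline
		for time.Now().Before(deadline) {
			if apiserverUp() {
				fmt.Println("kube-apiserver is running")
				return
			}
			time.Sleep(3 * time.Second) // the log shows ~3s between probes
		}
		fmt.Println("gave up waiting for kube-apiserver")
	}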
	I1217 20:34:26.940651  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:26.951081  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:26.951148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:26.975583  420062 cri.go:89] found id: ""
	I1217 20:34:26.975598  420062 logs.go:282] 0 containers: []
	W1217 20:34:26.975606  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:26.975611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:26.975671  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:27.003924  420062 cri.go:89] found id: ""
	I1217 20:34:27.003939  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.003948  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:27.003954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:27.004018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:27.029433  420062 cri.go:89] found id: ""
	I1217 20:34:27.029446  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.029454  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:27.029460  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:27.029520  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:27.055977  420062 cri.go:89] found id: ""
	I1217 20:34:27.055990  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.055998  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:27.056027  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:27.056093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:27.081756  420062 cri.go:89] found id: ""
	I1217 20:34:27.081770  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.081777  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:27.081783  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:27.081846  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:27.106532  420062 cri.go:89] found id: ""
	I1217 20:34:27.106546  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.106554  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:27.106587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:27.106651  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:27.131573  420062 cri.go:89] found id: ""
	I1217 20:34:27.131587  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.131595  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:27.131603  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:27.131613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:27.194270  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:27.194290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:27.222438  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:27.222453  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:27.284134  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:27.284154  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:27.300336  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:27.300352  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:27.369337  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:29.871004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:29.881325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:29.881389  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:29.906739  420062 cri.go:89] found id: ""
	I1217 20:34:29.906753  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.906760  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:29.906766  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:29.906828  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:29.935023  420062 cri.go:89] found id: ""
	I1217 20:34:29.935037  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.935045  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:29.935049  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:29.935110  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:29.968427  420062 cri.go:89] found id: ""
	I1217 20:34:29.968442  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.968449  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:29.968454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:29.968514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:29.993120  420062 cri.go:89] found id: ""
	I1217 20:34:29.993133  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.993141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:29.993147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:29.993208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:30.038216  420062 cri.go:89] found id: ""
	I1217 20:34:30.038232  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.038240  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:30.038256  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:30.038331  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:30.088044  420062 cri.go:89] found id: ""
	I1217 20:34:30.088059  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.088067  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:30.088080  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:30.088145  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:30.116773  420062 cri.go:89] found id: ""
	I1217 20:34:30.116789  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.116798  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:30.116808  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:30.116819  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:30.175618  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:30.175638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:30.191950  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:30.191967  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:30.268938  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:30.268949  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:30.268960  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:30.345609  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:30.345631  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
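	The repeated "failed describe nodes" warnings also show why log collection keeps going despite the dead apiserver: kubectl exits with status 1 ("connection refused"), and the failure is recorded as a warning instead of aborting the other collectors. A small sketch of that tolerant step, reusing the kubectl path and kubeconfig from the log (the logging calls are illustrative, not minikube's logs.go):

	package main

	import (
		"log"
		"os/exec"
	)

	func main() {
		// kubectl path and kubeconfig are taken verbatim from the log above.
		cmd := exec.Command("sudo",
			"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
			"describe", "nodes",
			"--kubeconfig=/var/lib/minikube/kubeconfig")
		out, err := cmd.CombinedOutput()
		if err != nil {
			// With the apiserver down kubectl exits 1 ("connection refused");
			// log it as a warning and let the other collectors keep running.
			log.Printf("W failed describe nodes: %v\n%s", err, out)
			return
		}
		log.Printf("I describe nodes:\n%s", out)
	}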
	I1217 20:34:32.873852  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:32.884009  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:32.884072  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:32.908673  420062 cri.go:89] found id: ""
	I1217 20:34:32.908688  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.908696  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:32.908701  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:32.908761  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:32.933101  420062 cri.go:89] found id: ""
	I1217 20:34:32.933115  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.933122  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:32.933127  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:32.933192  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:32.956968  420062 cri.go:89] found id: ""
	I1217 20:34:32.956982  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.956991  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:32.956996  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:32.957054  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:32.982228  420062 cri.go:89] found id: ""
	I1217 20:34:32.982241  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.982249  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:32.982254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:32.982312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:33.011791  420062 cri.go:89] found id: ""
	I1217 20:34:33.011805  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.011812  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:33.011818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:33.011885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:33.038878  420062 cri.go:89] found id: ""
	I1217 20:34:33.038894  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.038901  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:33.038907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:33.038969  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:33.068421  420062 cri.go:89] found id: ""
	I1217 20:34:33.068436  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.068443  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:33.068453  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:33.068463  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:33.083444  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:33.083461  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:33.147593  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:33.147604  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:33.147617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:33.211005  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:33.211025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:33.247311  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:33.247327  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
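	Each "Gathering logs for ..." step is just a journalctl, dmesg, or crictl invocation executed over SSH. A sketch of collecting the same bundle manually on the node (commands copied from the Run: lines above; only the output redirects are added):

	    sudo journalctl -u kubelet -n 400    > kubelet.log
	    sudo journalctl -u containerd -n 400 > containerd.log
	    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400 > dmesg.log
	    sudo crictl ps -a                    > container-status.txt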
	I1217 20:34:35.820692  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:35.830805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:35.830879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:35.855694  420062 cri.go:89] found id: ""
	I1217 20:34:35.855708  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.855716  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:35.855721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:35.855780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:35.879932  420062 cri.go:89] found id: ""
	I1217 20:34:35.879947  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.879955  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:35.879960  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:35.880021  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:35.904606  420062 cri.go:89] found id: ""
	I1217 20:34:35.904622  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.904630  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:35.904635  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:35.904700  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:35.932655  420062 cri.go:89] found id: ""
	I1217 20:34:35.932669  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.932676  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:35.932681  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:35.932742  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:35.956665  420062 cri.go:89] found id: ""
	I1217 20:34:35.956679  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.956686  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:35.956691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:35.956748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:35.981363  420062 cri.go:89] found id: ""
	I1217 20:34:35.981377  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.981385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:35.981391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:35.981450  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:36.013052  420062 cri.go:89] found id: ""
	I1217 20:34:36.013068  420062 logs.go:282] 0 containers: []
	W1217 20:34:36.013076  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:36.013084  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:36.013097  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:36.080346  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:36.080367  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:36.109280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:36.109296  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:36.168612  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:36.168630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:36.183490  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:36.183505  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:36.254206  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:38.754461  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:38.764820  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:38.764885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:38.790226  420062 cri.go:89] found id: ""
	I1217 20:34:38.790243  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.790251  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:38.790257  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:38.790317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:38.815898  420062 cri.go:89] found id: ""
	I1217 20:34:38.815913  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.815920  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:38.815925  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:38.815986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:38.840879  420062 cri.go:89] found id: ""
	I1217 20:34:38.840894  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.840901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:38.840907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:38.840967  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:38.865756  420062 cri.go:89] found id: ""
	I1217 20:34:38.865772  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.865780  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:38.865785  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:38.865851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:38.893497  420062 cri.go:89] found id: ""
	I1217 20:34:38.893511  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.893518  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:38.893523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:38.893582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:38.918737  420062 cri.go:89] found id: ""
	I1217 20:34:38.918751  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.918758  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:38.918763  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:38.918821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:38.943126  420062 cri.go:89] found id: ""
	I1217 20:34:38.943140  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.943147  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:38.943155  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:38.943166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:39.008933  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
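	The describe-nodes step keeps failing for the same underlying reason as everything else: nothing is listening on 8441, rather than the kubeconfig pointing at the wrong endpoint. A quick way to separate those two cases on the node (a sketch; the kubeconfig path is the one used in the log, and nc is assumed to be installed):

	    # Which endpoint does the node-local kubeconfig target?
	    sudo grep 'server:' /var/lib/minikube/kubeconfig
	    # Is that port accepting connections?
	    nc -z -w 2 localhost 8441 && echo "port open" || echo "connection refused"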
	I1217 20:34:39.008944  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:39.008955  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:39.071529  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:39.071550  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:39.098851  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:39.098866  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:39.157559  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:39.157578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.673292  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:41.683569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:41.683631  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:41.712444  420062 cri.go:89] found id: ""
	I1217 20:34:41.712458  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.712466  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:41.712471  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:41.712540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:41.737230  420062 cri.go:89] found id: ""
	I1217 20:34:41.737244  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.737253  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:41.737258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:41.737320  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:41.765904  420062 cri.go:89] found id: ""
	I1217 20:34:41.765918  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.765926  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:41.765931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:41.765993  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:41.790803  420062 cri.go:89] found id: ""
	I1217 20:34:41.790818  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.790826  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:41.790831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:41.790891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:41.816378  420062 cri.go:89] found id: ""
	I1217 20:34:41.816393  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.816399  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:41.816405  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:41.816465  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:41.846163  420062 cri.go:89] found id: ""
	I1217 20:34:41.846177  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.846184  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:41.846190  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:41.846249  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:41.874235  420062 cri.go:89] found id: ""
	I1217 20:34:41.874249  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.874257  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:41.874264  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:41.874278  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:41.930007  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:41.930025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.944733  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:41.944748  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:42.015145  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:42.015157  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:42.015168  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:42.083018  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:42.083046  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.617783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:44.627898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:44.627959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:44.654510  420062 cri.go:89] found id: ""
	I1217 20:34:44.654524  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.654531  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:44.654536  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:44.654600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:44.681532  420062 cri.go:89] found id: ""
	I1217 20:34:44.681547  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.681554  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:44.681560  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:44.681620  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:44.705927  420062 cri.go:89] found id: ""
	I1217 20:34:44.705941  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.705948  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:44.705953  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:44.706010  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:44.730835  420062 cri.go:89] found id: ""
	I1217 20:34:44.730849  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.730857  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:44.730862  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:44.730925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:44.754987  420062 cri.go:89] found id: ""
	I1217 20:34:44.755002  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.755009  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:44.755014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:44.755074  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:44.778787  420062 cri.go:89] found id: ""
	I1217 20:34:44.778801  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.778808  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:44.778814  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:44.778874  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:44.804370  420062 cri.go:89] found id: ""
	I1217 20:34:44.804385  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.804392  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:44.804401  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:44.804411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:44.870852  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:44.870872  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.901529  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:44.901545  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:44.961405  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:44.961428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:44.976411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:44.976427  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:45.055180  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.555437  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:47.565320  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:47.565380  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:47.594473  420062 cri.go:89] found id: ""
	I1217 20:34:47.594488  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.594495  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:47.594500  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:47.594560  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:47.618819  420062 cri.go:89] found id: ""
	I1217 20:34:47.618833  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.618840  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:47.618845  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:47.618906  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:47.643299  420062 cri.go:89] found id: ""
	I1217 20:34:47.643313  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.643320  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:47.643325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:47.643386  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:47.668500  420062 cri.go:89] found id: ""
	I1217 20:34:47.668514  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.668522  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:47.668527  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:47.668588  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:47.694650  420062 cri.go:89] found id: ""
	I1217 20:34:47.694671  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.694678  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:47.694683  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:47.694745  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:47.729169  420062 cri.go:89] found id: ""
	I1217 20:34:47.729183  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.729192  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:47.729197  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:47.729258  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:47.753481  420062 cri.go:89] found id: ""
	I1217 20:34:47.753494  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.753501  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:47.753509  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:47.753521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:47.768175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:47.768192  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:47.832224  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.832234  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:47.832264  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:47.894275  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:47.894294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:47.921621  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:47.921638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.477347  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:50.487837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:50.487905  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:50.515440  420062 cri.go:89] found id: ""
	I1217 20:34:50.515460  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.515468  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:50.515473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:50.515545  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:50.542521  420062 cri.go:89] found id: ""
	I1217 20:34:50.542546  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.542553  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:50.542559  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:50.542629  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:50.569586  420062 cri.go:89] found id: ""
	I1217 20:34:50.569600  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.569613  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:50.569618  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:50.569677  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:50.597938  420062 cri.go:89] found id: ""
	I1217 20:34:50.597951  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.597958  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:50.597966  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:50.598024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:50.627019  420062 cri.go:89] found id: ""
	I1217 20:34:50.627044  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.627052  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:50.627057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:50.627128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:50.655921  420062 cri.go:89] found id: ""
	I1217 20:34:50.655948  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.655956  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:50.655962  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:50.656028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:50.680457  420062 cri.go:89] found id: ""
	I1217 20:34:50.680471  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.680479  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:50.680487  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:50.680502  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:50.742350  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:50.742360  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:50.742370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:50.802977  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:50.802997  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:50.830354  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:50.830370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.887850  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:50.887869  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
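	The harness retries this same probe roughly every three seconds until its overall timeout expires. An equivalent hand-rolled wait loop (a sketch; the /healthz endpoint and the use of curl are assumptions, the port comes from the log):

	    # Block until the apiserver answers, or interrupt when you give up.
	    until curl -sk --max-time 2 https://localhost:8441/healthz >/dev/null; do
	      sleep 3
	    done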
	I1217 20:34:53.403065  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:53.413162  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:53.413227  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:53.437500  420062 cri.go:89] found id: ""
	I1217 20:34:53.437513  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.437521  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:53.437526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:53.437592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:53.462889  420062 cri.go:89] found id: ""
	I1217 20:34:53.462902  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.462910  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:53.462915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:53.462972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:53.493212  420062 cri.go:89] found id: ""
	I1217 20:34:53.493226  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.493234  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:53.493239  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:53.493301  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:53.521829  420062 cri.go:89] found id: ""
	I1217 20:34:53.521844  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.521851  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:53.521857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:53.521919  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:53.558427  420062 cri.go:89] found id: ""
	I1217 20:34:53.558442  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.558449  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:53.558454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:53.558513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:53.583439  420062 cri.go:89] found id: ""
	I1217 20:34:53.583453  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.583460  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:53.583466  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:53.583526  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:53.608693  420062 cri.go:89] found id: ""
	I1217 20:34:53.608707  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.608714  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:53.608722  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:53.608732  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:53.664959  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:53.664980  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.679865  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:53.679886  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:53.742568  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:53.742579  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:53.742591  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:53.803297  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:53.803317  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.335304  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:56.344915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:56.344977  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:56.368289  420062 cri.go:89] found id: ""
	I1217 20:34:56.368304  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.368312  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:56.368319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:56.368388  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:56.392693  420062 cri.go:89] found id: ""
	I1217 20:34:56.392707  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.392715  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:56.392721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:56.392782  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:56.419795  420062 cri.go:89] found id: ""
	I1217 20:34:56.419809  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.419825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:56.419834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:56.419902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:56.445038  420062 cri.go:89] found id: ""
	I1217 20:34:56.445052  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.445060  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:56.445065  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:56.445128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:56.474272  420062 cri.go:89] found id: ""
	I1217 20:34:56.474287  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.474294  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:56.474300  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:56.474366  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:56.507935  420062 cri.go:89] found id: ""
	I1217 20:34:56.507950  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.507957  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:56.507963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:56.508030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:56.535999  420062 cri.go:89] found id: ""
	I1217 20:34:56.536012  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.536030  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:56.536039  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:56.536050  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.572020  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:56.572037  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:56.628661  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:56.628681  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:56.643833  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:56.643856  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:56.710351  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
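	(The cycle above repeats for the rest of this test: minikube probes for a running kube-apiserver, finds no control-plane containers, and falls back to collecting diagnostics. A minimal bash sketch of that probe, using only the crictl invocation and component names visible in the log lines; run inside the minikube node:)

		# Sketch (not from the log itself): reproduce the per-component
		# container probe. An empty result is what the log reports as
		# 'No container was found matching "<name>"'.
		for c in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
		  ids="$(sudo crictl ps -a --quiet --name="$c")"
		  if [ -z "$ids" ]; then
		    echo "No container was found matching \"$c\""
		  fi
		done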
	I1217 20:34:56.710361  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:56.710380  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.273579  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:59.283581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:59.283645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:59.309480  420062 cri.go:89] found id: ""
	I1217 20:34:59.309493  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.309500  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:59.309506  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:59.309564  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:59.333365  420062 cri.go:89] found id: ""
	I1217 20:34:59.333378  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.333386  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:59.333391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:59.333452  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:59.357207  420062 cri.go:89] found id: ""
	I1217 20:34:59.357221  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.357228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:59.357233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:59.357298  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:59.381758  420062 cri.go:89] found id: ""
	I1217 20:34:59.381772  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.381781  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:59.381787  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:59.381845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:59.406750  420062 cri.go:89] found id: ""
	I1217 20:34:59.406764  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.406772  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:59.406777  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:59.406845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:59.431825  420062 cri.go:89] found id: ""
	I1217 20:34:59.431838  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.431846  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:59.431852  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:59.431913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:59.458993  420062 cri.go:89] found id: ""
	I1217 20:34:59.459007  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.459014  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:59.459022  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:59.459041  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:59.546381  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:59.546391  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:59.546401  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.613987  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:59.614007  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:59.644296  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:59.644311  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:59.703226  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:59.703245  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
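	(Every "describe nodes" attempt fails identically because nothing is listening on port 8441 inside the node; the port comes from the --apiserver-port=8441 flag in the test invocation. A sketch of an independent check, assuming curl is available on the node; /readyz is the standard kube-apiserver health endpoint:)

		# Sketch: confirm the refusal without going through kubectl.
		# -k skips TLS verification; expect "connection refused" while
		# no kube-apiserver container exists.
		curl -sk --max-time 5 https://localhost:8441/readyz || echo "apiserver not reachable"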
	I1217 20:35:02.218783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:02.229042  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:02.229114  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:02.254286  420062 cri.go:89] found id: ""
	I1217 20:35:02.254300  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.254308  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:02.254315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:02.254374  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:02.281092  420062 cri.go:89] found id: ""
	I1217 20:35:02.281106  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.281114  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:02.281120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:02.281198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:02.310195  420062 cri.go:89] found id: ""
	I1217 20:35:02.310209  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.310217  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:02.310222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:02.310294  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:02.338807  420062 cri.go:89] found id: ""
	I1217 20:35:02.338821  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.338829  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:02.338834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:02.338904  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:02.364604  420062 cri.go:89] found id: ""
	I1217 20:35:02.364618  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.364625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:02.364631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:02.364693  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:02.389458  420062 cri.go:89] found id: ""
	I1217 20:35:02.389473  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.389481  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:02.389486  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:02.389544  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:02.419120  420062 cri.go:89] found id: ""
	I1217 20:35:02.419134  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.419142  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:02.419151  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:02.419162  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:02.476620  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:02.476640  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.492411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:02.492428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:02.567285  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:02.567294  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:02.567308  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:02.635002  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:02.635022  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.163567  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:05.174184  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:05.174245  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:05.199116  420062 cri.go:89] found id: ""
	I1217 20:35:05.199130  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.199137  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:05.199143  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:05.199206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:05.223477  420062 cri.go:89] found id: ""
	I1217 20:35:05.223491  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.223498  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:05.223504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:05.223562  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:05.247303  420062 cri.go:89] found id: ""
	I1217 20:35:05.247317  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.247325  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:05.247332  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:05.247391  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:05.272620  420062 cri.go:89] found id: ""
	I1217 20:35:05.272633  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.272641  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:05.272646  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:05.272703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:05.300419  420062 cri.go:89] found id: ""
	I1217 20:35:05.300434  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.300441  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:05.300446  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:05.300505  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:05.325851  420062 cri.go:89] found id: ""
	I1217 20:35:05.325866  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.325873  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:05.325879  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:05.325938  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:05.354430  420062 cri.go:89] found id: ""
	I1217 20:35:05.354445  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.354452  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:05.354460  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:05.354475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:05.369668  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:05.369686  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:05.436390  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:05.436400  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:05.436411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:05.499177  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:05.499202  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.531231  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:05.531248  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
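	(The "Gathering logs for ..." steps map directly onto the journalctl, dmesg, and crictl commands shown in the Run lines. A condensed sketch of the same collection, with unit names and line counts copied from the log:)

		# Sketch: the log bundle minikube assembles on each retry.
		sudo journalctl -u kubelet -n 400                                        # kubelet logs
		sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400  # kernel warnings
		sudo journalctl -u containerd -n 400                                     # container runtime logs
		sudo "$(which crictl || echo crictl)" ps -a || sudo docker ps -a         # container status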
	I1217 20:35:08.088375  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:08.098640  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:08.098711  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:08.132112  420062 cri.go:89] found id: ""
	I1217 20:35:08.132127  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:08.132141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:08.132205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:08.157778  420062 cri.go:89] found id: ""
	I1217 20:35:08.157792  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.157800  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:08.157805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:08.157862  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:08.183372  420062 cri.go:89] found id: ""
	I1217 20:35:08.183386  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.183393  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:08.183399  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:08.183457  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:08.208186  420062 cri.go:89] found id: ""
	I1217 20:35:08.208200  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.208207  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:08.208212  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:08.208310  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:08.236181  420062 cri.go:89] found id: ""
	I1217 20:35:08.236195  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.236202  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:08.236207  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:08.236313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:08.261508  420062 cri.go:89] found id: ""
	I1217 20:35:08.261522  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.261529  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:08.261534  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:08.261593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:08.286303  420062 cri.go:89] found id: ""
	I1217 20:35:08.286318  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.286325  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:08.286333  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:08.286349  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.345547  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:08.345573  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:08.360551  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:08.360568  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:08.424581  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:08.424593  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:08.424606  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:08.489146  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:08.489166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.022570  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:11.034138  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:11.034205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:11.066795  420062 cri.go:89] found id: ""
	I1217 20:35:11.066810  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.066817  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:11.066825  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:11.066888  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:11.092902  420062 cri.go:89] found id: ""
	I1217 20:35:11.092917  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.092925  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:11.092931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:11.092998  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:11.120040  420062 cri.go:89] found id: ""
	I1217 20:35:11.120056  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.120064  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:11.120069  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:11.120138  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:11.150096  420062 cri.go:89] found id: ""
	I1217 20:35:11.150111  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.150118  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:11.150124  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:11.150186  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:11.178952  420062 cri.go:89] found id: ""
	I1217 20:35:11.178966  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.178973  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:11.178979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:11.179042  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:11.205194  420062 cri.go:89] found id: ""
	I1217 20:35:11.205208  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.205215  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:11.205221  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:11.205281  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:11.231314  420062 cri.go:89] found id: ""
	I1217 20:35:11.231327  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.231335  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:11.231343  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:11.231355  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:11.246458  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:11.246475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:11.312684  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:11.312696  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:11.312706  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:11.379354  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:11.379374  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.413484  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:11.413500  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:13.972078  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:13.982223  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:13.982290  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:14.022488  420062 cri.go:89] found id: ""
	I1217 20:35:14.022502  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.022510  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:14.022515  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:14.022575  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:14.059328  420062 cri.go:89] found id: ""
	I1217 20:35:14.059342  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.059364  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:14.059369  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:14.059435  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:14.085531  420062 cri.go:89] found id: ""
	I1217 20:35:14.085544  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.085552  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:14.085558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:14.085616  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:14.114113  420062 cri.go:89] found id: ""
	I1217 20:35:14.114134  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.114141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:14.114147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:14.114210  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:14.138505  420062 cri.go:89] found id: ""
	I1217 20:35:14.138519  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.138526  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:14.138532  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:14.138591  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:14.162838  420062 cri.go:89] found id: ""
	I1217 20:35:14.162852  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.162858  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:14.162863  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:14.162923  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:14.190631  420062 cri.go:89] found id: ""
	I1217 20:35:14.190651  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.190665  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:14.190672  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:14.190682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:14.246544  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:14.246563  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:14.261703  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:14.261719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:14.327698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:14.327708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:14.327721  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:14.391616  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:14.391635  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:16.921553  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:16.931542  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:16.931604  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:16.955206  420062 cri.go:89] found id: ""
	I1217 20:35:16.955220  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.955227  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:16.955233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:16.955291  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:16.984598  420062 cri.go:89] found id: ""
	I1217 20:35:16.984613  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.984620  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:16.984625  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:16.984683  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:17.033712  420062 cri.go:89] found id: ""
	I1217 20:35:17.033726  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.033733  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:17.033739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:17.033796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:17.061936  420062 cri.go:89] found id: ""
	I1217 20:35:17.061950  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.061957  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:17.061963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:17.062023  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:17.086921  420062 cri.go:89] found id: ""
	I1217 20:35:17.086936  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.086943  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:17.086948  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:17.087009  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:17.112474  420062 cri.go:89] found id: ""
	I1217 20:35:17.112488  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.112495  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:17.112501  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:17.112558  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:17.137847  420062 cri.go:89] found id: ""
	I1217 20:35:17.137867  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.137875  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:17.137882  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:17.137892  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:17.198885  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:17.198904  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:17.213637  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:17.213652  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:17.281467  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:17.281478  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:17.281488  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:17.343313  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:17.343334  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:19.871984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:19.882066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:19.882128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:19.907664  420062 cri.go:89] found id: ""
	I1217 20:35:19.907678  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.907686  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:19.907691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:19.907750  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:19.936014  420062 cri.go:89] found id: ""
	I1217 20:35:19.936028  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.936035  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:19.936040  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:19.936099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:19.961865  420062 cri.go:89] found id: ""
	I1217 20:35:19.961881  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.961888  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:19.961893  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:19.961954  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:19.988749  420062 cri.go:89] found id: ""
	I1217 20:35:19.988762  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.988769  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:19.988775  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:19.988832  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:20.021844  420062 cri.go:89] found id: ""
	I1217 20:35:20.021859  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.021866  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:20.021873  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:20.021936  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:20.064328  420062 cri.go:89] found id: ""
	I1217 20:35:20.064343  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.064351  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:20.064356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:20.064464  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:20.092230  420062 cri.go:89] found id: ""
	I1217 20:35:20.092244  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.092272  420062 logs.go:284] No container was found matching "kindnet"
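The scan that just finished is one `crictl ps -a --quiet --name=<component>` call per control-plane piece; an empty ID list for every name means containerd itself is reachable but no Kubernetes containers were ever created. The equivalent loop, run by hand on the node (component list taken from the log):

    # One crictl query per control-plane component, as in the scan above.
    for c in kube-apiserver etcd coredns kube-scheduler kube-proxy \
             kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$c")
      echo "$c: ${ids:-<none>}"
    done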
	I1217 20:35:20.092280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:20.092291  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:20.150597  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:20.150617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
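The dmesg flags above are short options: -P (--nopager), -H (--human), -L=never (no color), and --level restricting output to warning severity and above. Spelled out with long options, the same filter reads:

    # Kernel log, warnings and worse, last 400 lines; identical filter to
    # the short-option form in the log line above.
    sudo dmesg --nopager --human --color=never \
         --level warn,err,crit,alert,emerg | tail -n 400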
	I1217 20:35:20.166734  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:20.166751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:20.235344  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
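The describe-nodes step uses the kubectl binary minikube installed on the node together with the node-local kubeconfig, so it fails for the same reason as every other call here: no apiserver behind localhost:8441. A quick way to reproduce the check by hand with the same binary and config (a healthy cluster would print endpoint URLs instead of the errors above):

    # Same binary, same kubeconfig as the failing describe-nodes step.
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl \
         --kubeconfig=/var/lib/minikube/kubeconfig cluster-info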
	I1217 20:35:20.235354  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:20.235368  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:20.300971  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:20.300991  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:22.830503  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:22.840565  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:22.840627  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:22.865965  420062 cri.go:89] found id: ""
	I1217 20:35:22.865980  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.865987  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:22.865992  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:22.866051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:22.890981  420062 cri.go:89] found id: ""
	I1217 20:35:22.890995  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.891002  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:22.891007  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:22.891067  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:22.916050  420062 cri.go:89] found id: ""
	I1217 20:35:22.916064  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.916070  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:22.916075  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:22.916134  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:22.940231  420062 cri.go:89] found id: ""
	I1217 20:35:22.940244  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.940274  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:22.940280  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:22.940338  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:22.964651  420062 cri.go:89] found id: ""
	I1217 20:35:22.964665  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.964673  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:22.964678  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:22.964739  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:22.999102  420062 cri.go:89] found id: ""
	I1217 20:35:22.999118  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.999126  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:22.999133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:22.999201  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:23.031417  420062 cri.go:89] found id: ""
	I1217 20:35:23.031431  420062 logs.go:282] 0 containers: []
	W1217 20:35:23.031440  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:23.031447  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:23.031458  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:23.099279  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:23.099300  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:23.127896  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:23.127914  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:23.184706  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:23.184725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:23.199879  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:23.199895  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:23.267184  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
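The iteration that follows opens with `pgrep -xnf kube-apiserver.*minikube.*`: -f matches against the full command line, -x requires that full line to match the pattern exactly, and -n keeps only the newest match. A non-zero exit status is what sends the loop back into log gathering. Checked in isolation:

    # Exit 0 with a PID when a matching apiserver process exists, exit 1 otherwise.
    sudo pgrep -xnf 'kube-apiserver.*minikube.*' && echo running || echo not-running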
	I1217 20:35:25.768885  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:25.778947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:25.779017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:25.802991  420062 cri.go:89] found id: ""
	I1217 20:35:25.803005  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.803025  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:25.803031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:25.803093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:25.830724  420062 cri.go:89] found id: ""
	I1217 20:35:25.830738  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.830745  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:25.830751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:25.830813  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:25.860059  420062 cri.go:89] found id: ""
	I1217 20:35:25.860073  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.860081  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:25.860085  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:25.860150  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:25.896087  420062 cri.go:89] found id: ""
	I1217 20:35:25.896101  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.896108  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:25.896114  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:25.896173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:25.921891  420062 cri.go:89] found id: ""
	I1217 20:35:25.921905  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.921912  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:25.921918  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:25.921975  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:25.946115  420062 cri.go:89] found id: ""
	I1217 20:35:25.946129  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.946137  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:25.946142  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:25.946199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:25.970696  420062 cri.go:89] found id: ""
	I1217 20:35:25.970711  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.970719  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:25.970727  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:25.970737  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:26.031476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:26.031497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:26.053026  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:26.053044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:26.121268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:26.121279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:26.121290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:26.183866  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:26.183888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:28.713125  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:28.723373  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:28.723436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:28.750204  420062 cri.go:89] found id: ""
	I1217 20:35:28.750218  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.750225  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:28.750231  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:28.750295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:28.774507  420062 cri.go:89] found id: ""
	I1217 20:35:28.774520  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.774528  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:28.774533  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:28.774593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:28.799202  420062 cri.go:89] found id: ""
	I1217 20:35:28.799217  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.799225  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:28.799230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:28.799295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:28.823894  420062 cri.go:89] found id: ""
	I1217 20:35:28.823908  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.823916  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:28.823921  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:28.823981  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:28.848696  420062 cri.go:89] found id: ""
	I1217 20:35:28.848710  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.848717  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:28.848722  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:28.848780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:28.874108  420062 cri.go:89] found id: ""
	I1217 20:35:28.874121  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.874129  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:28.874146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:28.874206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:28.899607  420062 cri.go:89] found id: ""
	I1217 20:35:28.899621  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.899628  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:28.899636  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:28.899646  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:28.955990  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:28.956010  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:28.970828  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:28.970844  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:29.048596  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:29.048606  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:29.048627  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:29.115475  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:29.115495  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:31.644907  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:31.654819  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:31.654879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:31.678281  420062 cri.go:89] found id: ""
	I1217 20:35:31.678295  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.678303  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:31.678308  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:31.678370  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:31.702902  420062 cri.go:89] found id: ""
	I1217 20:35:31.702916  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.702923  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:31.702929  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:31.702988  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:31.730614  420062 cri.go:89] found id: ""
	I1217 20:35:31.730629  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.730643  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:31.730648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:31.730715  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:31.757724  420062 cri.go:89] found id: ""
	I1217 20:35:31.757738  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.757745  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:31.757751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:31.757821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:31.781313  420062 cri.go:89] found id: ""
	I1217 20:35:31.781326  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.781333  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:31.781338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:31.781401  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:31.805048  420062 cri.go:89] found id: ""
	I1217 20:35:31.805061  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.805068  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:31.805074  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:31.805133  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:31.829157  420062 cri.go:89] found id: ""
	I1217 20:35:31.829172  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.829178  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:31.829186  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:31.829211  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:31.884232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:31.884262  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:31.899125  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:31.899143  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:31.960768  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:31.960779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:31.960789  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:32.026560  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:32.026580  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:34.561956  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:34.573345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:34.573414  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:34.601971  420062 cri.go:89] found id: ""
	I1217 20:35:34.601985  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.601993  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:34.601998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:34.602057  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:34.631487  420062 cri.go:89] found id: ""
	I1217 20:35:34.631500  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.631508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:34.631513  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:34.631572  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:34.656452  420062 cri.go:89] found id: ""
	I1217 20:35:34.656465  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.656473  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:34.656478  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:34.656540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:34.682582  420062 cri.go:89] found id: ""
	I1217 20:35:34.682596  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.682603  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:34.682609  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:34.682676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:34.713925  420062 cri.go:89] found id: ""
	I1217 20:35:34.713939  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.713947  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:34.713952  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:34.714017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:34.742385  420062 cri.go:89] found id: ""
	I1217 20:35:34.742400  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.742408  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:34.742414  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:34.742473  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:34.767035  420062 cri.go:89] found id: ""
	I1217 20:35:34.767049  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.767056  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:34.767064  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:34.767075  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:34.822796  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:34.822817  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:34.837590  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:34.837613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:34.900508  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:34.900518  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:34.900529  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:34.962881  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:34.962905  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:37.494984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:37.505451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:37.505514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:37.530852  420062 cri.go:89] found id: ""
	I1217 20:35:37.530866  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.530874  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:37.530885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:37.530948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:37.555283  420062 cri.go:89] found id: ""
	I1217 20:35:37.555298  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.555305  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:37.555319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:37.555384  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:37.580310  420062 cri.go:89] found id: ""
	I1217 20:35:37.580324  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.580342  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:37.580347  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:37.580407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:37.604561  420062 cri.go:89] found id: ""
	I1217 20:35:37.604575  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.604582  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:37.604587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:37.604649  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:37.633577  420062 cri.go:89] found id: ""
	I1217 20:35:37.633591  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.633598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:37.633603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:37.633668  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:37.659137  420062 cri.go:89] found id: ""
	I1217 20:35:37.659152  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.659159  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:37.659183  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:37.659280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:37.687689  420062 cri.go:89] found id: ""
	I1217 20:35:37.687704  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.687711  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:37.687719  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:37.687738  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:37.742459  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:37.742478  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:37.757175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:37.757191  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:37.822005  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:37.822015  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:37.822025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:37.885848  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:37.885870  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:40.416602  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:40.427031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:40.427099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:40.452190  420062 cri.go:89] found id: ""
	I1217 20:35:40.452204  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.452212  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:40.452218  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:40.452299  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:40.478942  420062 cri.go:89] found id: ""
	I1217 20:35:40.478956  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.478963  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:40.478969  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:40.479027  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:40.504873  420062 cri.go:89] found id: ""
	I1217 20:35:40.504886  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.504893  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:40.504898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:40.504958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:40.530153  420062 cri.go:89] found id: ""
	I1217 20:35:40.530167  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.530173  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:40.530179  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:40.530239  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:40.558703  420062 cri.go:89] found id: ""
	I1217 20:35:40.558717  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.558725  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:40.558731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:40.558799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:40.583753  420062 cri.go:89] found id: ""
	I1217 20:35:40.583768  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.583777  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:40.583793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:40.583856  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:40.608061  420062 cri.go:89] found id: ""
	I1217 20:35:40.608075  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.608083  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:40.608099  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:40.608111  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:40.665201  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:40.665222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:40.680290  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:40.680307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:40.752424  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:40.752435  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:40.752446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:40.819510  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:40.819535  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
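By this point the timestamps show the same poll repeating roughly every three seconds with identical results, which is the shape of a wait-until-timeout loop. An illustrative reconstruction (the 3-second interval is read off the log; the retry cap is an assumption, not a value from minikube source):

    # Poll for the apiserver process until it appears or the cap is hit.
    for i in $(seq 1 60); do
      sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null && break
      sleep 3
    done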
	I1217 20:35:43.356404  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:43.367228  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:43.367293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:43.391809  420062 cri.go:89] found id: ""
	I1217 20:35:43.391824  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.391831  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:43.391836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:43.391895  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:43.417869  420062 cri.go:89] found id: ""
	I1217 20:35:43.417883  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.417890  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:43.417895  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:43.417959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:43.443314  420062 cri.go:89] found id: ""
	I1217 20:35:43.443328  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.443335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:43.443340  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:43.443400  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:43.469332  420062 cri.go:89] found id: ""
	I1217 20:35:43.469346  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.469352  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:43.469358  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:43.469418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:43.494242  420062 cri.go:89] found id: ""
	I1217 20:35:43.494256  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.494264  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:43.494277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:43.494341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:43.520502  420062 cri.go:89] found id: ""
	I1217 20:35:43.520515  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.520523  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:43.520529  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:43.520592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:43.549390  420062 cri.go:89] found id: ""
	I1217 20:35:43.549404  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.549411  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:43.549419  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:43.549435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:43.565708  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:43.565725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:43.633544  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:43.633555  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:43.633567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:43.696433  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:43.696457  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.727227  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:43.727244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
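The block above is one pass of minikube's apiserver wait loop: pgrep looks for a kube-apiserver process, then crictl is queried for containers matching each control-plane component (kube-apiserver, etcd, coredns, kube-scheduler, kube-proxy, kube-controller-manager, kindnet), and every query comes back empty. Below is a minimal stand-alone sketch of that poll, not minikube's actual implementation; it assumes direct shell access to the node with crictl on the PATH, whereas minikube drives the same commands through ssh_runner.go inside the functional-682596 container.

// poll_apiserver.go - minimal sketch of the polling loop visible in the log.
// Assumes it runs on the node itself with sudo and crictl available.
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"time"
)

// containerIDs returns the IDs crictl reports for a given container name;
// an empty result corresponds to the `found id: ""` / "0 containers" lines.
func containerIDs(name string) []string {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
	if err != nil {
		return nil
	}
	return strings.Fields(string(out))
}

func main() {
	for i := 0; i < 10; i++ {
		if ids := containerIDs("kube-apiserver"); len(ids) > 0 {
			fmt.Println("kube-apiserver container found:", ids)
			return
		}
		fmt.Println("no kube-apiserver container yet; retrying")
		time.Sleep(3 * time.Second) // the log shows roughly 3s between attempts
	}
}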
	I1217 20:35:46.288373  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:46.298318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:46.298381  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:46.322903  420062 cri.go:89] found id: ""
	I1217 20:35:46.322918  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.322925  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:46.322931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:46.322992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:46.347241  420062 cri.go:89] found id: ""
	I1217 20:35:46.347253  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.347260  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:46.347265  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:46.347324  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:46.372209  420062 cri.go:89] found id: ""
	I1217 20:35:46.372222  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.372229  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:46.372235  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:46.372313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:46.399343  420062 cri.go:89] found id: ""
	I1217 20:35:46.399357  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.399365  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:46.399370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:46.399430  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:46.425023  420062 cri.go:89] found id: ""
	I1217 20:35:46.425036  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.425051  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:46.425057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:46.425119  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:46.450066  420062 cri.go:89] found id: ""
	I1217 20:35:46.450080  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.450087  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:46.450092  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:46.450153  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:46.474598  420062 cri.go:89] found id: ""
	I1217 20:35:46.474612  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.474619  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:46.474644  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:46.474654  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:46.536781  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:46.536801  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:46.570140  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:46.570155  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.628870  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:46.628888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:46.643875  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:46.643891  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:46.709883  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
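Every `describe nodes` attempt above fails identically with `dial tcp [::1]:8441: connect: connection refused`. A refused connection means nothing is listening on the apiserver port at all (8441 is the --apiserver-port this test starts with), which is consistent with crictl finding zero kube-apiserver containers; it is a different failure mode from a TLS or authentication error, where the TCP connect itself would succeed. A small sketch of that distinction, using a plain TCP dial:

// probe.go - sketch for reproducing the failure mode in the stderr blocks.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
	if err != nil {
		// Matches the log: connection refused => the apiserver never came up,
		// consistent with crictl reporting no kube-apiserver container.
		fmt.Println("apiserver unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("something is listening on 8441; the failure is higher in the stack")
}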
	I1217 20:35:49.210139  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:49.220394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:49.220461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:49.256343  420062 cri.go:89] found id: ""
	I1217 20:35:49.256358  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.256365  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:49.256370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:49.256431  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:49.290171  420062 cri.go:89] found id: ""
	I1217 20:35:49.290185  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.290193  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:49.290198  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:49.290261  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:49.320916  420062 cri.go:89] found id: ""
	I1217 20:35:49.320931  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.320939  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:49.320944  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:49.321003  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:49.345394  420062 cri.go:89] found id: ""
	I1217 20:35:49.345408  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.345415  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:49.345421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:49.345478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:49.370339  420062 cri.go:89] found id: ""
	I1217 20:35:49.370353  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.370360  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:49.370365  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:49.370424  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:49.394642  420062 cri.go:89] found id: ""
	I1217 20:35:49.394656  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.394663  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:49.394668  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:49.394734  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:49.422548  420062 cri.go:89] found id: ""
	I1217 20:35:49.422562  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.422569  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:49.422577  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:49.422594  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:49.479225  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:49.479246  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:49.494238  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:49.494255  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:49.560086  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:49.560096  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:49.560106  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:49.622094  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:49.622114  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.150210  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:52.160168  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:52.160231  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:52.184746  420062 cri.go:89] found id: ""
	I1217 20:35:52.184760  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.184767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:52.184779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:52.184835  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:52.209501  420062 cri.go:89] found id: ""
	I1217 20:35:52.209515  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.209522  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:52.209528  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:52.209586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:52.234558  420062 cri.go:89] found id: ""
	I1217 20:35:52.234571  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.234579  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:52.234584  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:52.234654  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:52.265703  420062 cri.go:89] found id: ""
	I1217 20:35:52.265716  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.265724  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:52.265729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:52.265794  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:52.297248  420062 cri.go:89] found id: ""
	I1217 20:35:52.297263  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.297270  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:52.297275  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:52.297334  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:52.325342  420062 cri.go:89] found id: ""
	I1217 20:35:52.325355  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.325362  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:52.325367  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:52.325433  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:52.349812  420062 cri.go:89] found id: ""
	I1217 20:35:52.349826  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.349843  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:52.349851  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:52.349862  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.380735  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:52.380751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:52.436131  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:52.436151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:52.451427  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:52.451445  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:52.518482  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:52.518492  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:52.518503  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
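Between polls, the test gathers the same five diagnostics each round: kubelet and containerd unit logs via journalctl, kernel warnings via dmesg, container status via crictl with a docker fallback, and `kubectl describe nodes` against the kubeconfig baked into the node. The sketch below condenses that fan-out; the command strings are copied verbatim from the log, and running them assumes root on the minikube node rather than minikube's SSH transport.

// gather.go - condensed sketch of the "Gathering logs for ..." fan-out.
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	cmds := map[string]string{
		"kubelet":          `sudo journalctl -u kubelet -n 400`,
		"containerd":       `sudo journalctl -u containerd -n 400`,
		"dmesg":            `sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400`,
		"container status": "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a",
		"describe nodes":   `sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig`,
	}
	for name, cmd := range cmds {
		out, err := exec.Command("/bin/bash", "-c", cmd).CombinedOutput()
		fmt.Printf("== %s ==\n%s", name, out)
		if err != nil {
			// e.g. describe nodes exits 1 while the apiserver is down, as above.
			fmt.Println("(command failed:", err, ")")
		}
	}
}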
	I1217 20:35:55.081073  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:55.091720  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:55.091797  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:55.117311  420062 cri.go:89] found id: ""
	I1217 20:35:55.117325  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.117333  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:55.117338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:55.117398  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:55.141668  420062 cri.go:89] found id: ""
	I1217 20:35:55.141683  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.141692  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:55.141697  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:55.141760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:55.166517  420062 cri.go:89] found id: ""
	I1217 20:35:55.166534  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.166541  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:55.166546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:55.166611  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:55.191282  420062 cri.go:89] found id: ""
	I1217 20:35:55.191296  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.191304  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:55.191309  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:55.191369  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:55.215605  420062 cri.go:89] found id: ""
	I1217 20:35:55.215619  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.215626  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:55.215631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:55.215690  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:55.247101  420062 cri.go:89] found id: ""
	I1217 20:35:55.247124  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.247132  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:55.247137  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:55.247205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:55.288704  420062 cri.go:89] found id: ""
	I1217 20:35:55.288718  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.288725  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:55.288732  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:55.288743  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:55.320382  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:55.320398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:55.379997  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:55.380016  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:55.394762  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:55.394780  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:55.459997  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:55.460007  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:55.460018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:58.024408  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:58.035410  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:58.035478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:58.062124  420062 cri.go:89] found id: ""
	I1217 20:35:58.062138  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.062145  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:58.062151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:58.062211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:58.088229  420062 cri.go:89] found id: ""
	I1217 20:35:58.088243  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.088270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:58.088276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:58.088335  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:58.113240  420062 cri.go:89] found id: ""
	I1217 20:35:58.113255  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.113261  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:58.113266  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:58.113325  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:58.141811  420062 cri.go:89] found id: ""
	I1217 20:35:58.141825  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.141832  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:58.141837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:58.141897  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:58.170463  420062 cri.go:89] found id: ""
	I1217 20:35:58.170477  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.170484  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:58.170490  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:58.170548  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:58.194647  420062 cri.go:89] found id: ""
	I1217 20:35:58.194670  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.194678  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:58.194684  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:58.194760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:58.219714  420062 cri.go:89] found id: ""
	I1217 20:35:58.219728  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.219735  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:58.219743  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:58.219754  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:58.263178  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:58.263194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:58.325412  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:58.325433  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:58.341419  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:58.341435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:58.403135  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:58.403147  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:58.403163  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:00.965498  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:00.975759  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:00.975820  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:01.000786  420062 cri.go:89] found id: ""
	I1217 20:36:01.000803  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.000811  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:01.000818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:01.000892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:01.025695  420062 cri.go:89] found id: ""
	I1217 20:36:01.025709  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.025716  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:01.025721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:01.025784  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:01.054712  420062 cri.go:89] found id: ""
	I1217 20:36:01.054727  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.054734  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:01.054739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:01.054799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:01.083318  420062 cri.go:89] found id: ""
	I1217 20:36:01.083332  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.083340  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:01.083345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:01.083406  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:01.107939  420062 cri.go:89] found id: ""
	I1217 20:36:01.107954  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.107962  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:01.107968  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:01.108030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:01.134926  420062 cri.go:89] found id: ""
	I1217 20:36:01.134940  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.134947  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:01.134954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:01.135018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:01.161095  420062 cri.go:89] found id: ""
	I1217 20:36:01.161111  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.161121  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:01.161130  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:01.161141  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:01.222094  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:01.222112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:01.239432  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:01.239449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:01.331243  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:01.331254  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:01.331265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:01.398128  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:01.398148  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:03.929660  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:03.940045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:03.940111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:03.963644  420062 cri.go:89] found id: ""
	I1217 20:36:03.963658  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.963665  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:03.963670  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:03.963727  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:03.996893  420062 cri.go:89] found id: ""
	I1217 20:36:03.996907  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.996914  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:03.996919  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:03.996987  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:04.028499  420062 cri.go:89] found id: ""
	I1217 20:36:04.028514  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.028530  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:04.028535  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:04.028607  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:04.054700  420062 cri.go:89] found id: ""
	I1217 20:36:04.054715  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.054723  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:04.054728  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:04.054785  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:04.082040  420062 cri.go:89] found id: ""
	I1217 20:36:04.082054  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.082063  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:04.082068  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:04.082131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:04.107015  420062 cri.go:89] found id: ""
	I1217 20:36:04.107029  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.107037  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:04.107043  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:04.107109  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:04.134634  420062 cri.go:89] found id: ""
	I1217 20:36:04.134648  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.134655  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:04.134663  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:04.134673  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:04.191059  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:04.191079  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:04.206280  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:04.206298  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:04.297698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:04.297708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:04.297719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:04.364378  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:04.364398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:06.892149  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:06.902353  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:06.902418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:06.927834  420062 cri.go:89] found id: ""
	I1217 20:36:06.927847  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.927855  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:06.927860  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:06.927925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:06.952936  420062 cri.go:89] found id: ""
	I1217 20:36:06.952949  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.952956  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:06.952965  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:06.953024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:06.976184  420062 cri.go:89] found id: ""
	I1217 20:36:06.976198  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.976205  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:06.976210  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:06.976297  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:07.004079  420062 cri.go:89] found id: ""
	I1217 20:36:07.004093  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.004101  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:07.004106  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:07.004167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:07.029604  420062 cri.go:89] found id: ""
	I1217 20:36:07.029618  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.029625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:07.029630  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:07.029698  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:07.058618  420062 cri.go:89] found id: ""
	I1217 20:36:07.058637  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.058645  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:07.058650  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:07.058709  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:07.085932  420062 cri.go:89] found id: ""
	I1217 20:36:07.085946  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.085953  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:07.085961  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:07.085972  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:07.100543  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:07.100561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:07.162557  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:07.162567  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:07.162578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:07.226244  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:07.226265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:07.280558  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:07.280574  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
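The cycle above is minikube's control-plane probe: each expected component is looked up with "sudo crictl ps -a --quiet --name=<component>", an empty result is logged as no container found, and the loop then falls back to gathering journalctl output. A minimal bash sketch of the same probe, assuming crictl is on the node's PATH (the component list mirrors the one in this log):

    for c in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$c")
      # crictl prints one container ID per line; empty output means no match.
      [ -z "$ids" ] && echo "no container matching \"$c\"" || echo "$c: $ids"
    done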
	I1217 20:36:09.844282  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:09.854593  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:09.854676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:09.883180  420062 cri.go:89] found id: ""
	I1217 20:36:09.883194  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.883202  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:09.883208  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:09.883268  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:09.907225  420062 cri.go:89] found id: ""
	I1217 20:36:09.907240  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.907248  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:09.907254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:09.907315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:09.936079  420062 cri.go:89] found id: ""
	I1217 20:36:09.936093  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.936100  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:09.936105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:09.936167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:09.961921  420062 cri.go:89] found id: ""
	I1217 20:36:09.961935  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.961943  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:09.961949  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:09.962028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:09.989285  420062 cri.go:89] found id: ""
	I1217 20:36:09.989299  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.989307  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:09.989312  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:09.989371  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:10.023888  420062 cri.go:89] found id: ""
	I1217 20:36:10.023905  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.023913  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:10.023920  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:10.023992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:10.056062  420062 cri.go:89] found id: ""
	I1217 20:36:10.056077  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.056084  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:10.056102  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:10.056112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:10.118144  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:10.118165  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:10.153504  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:10.153521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:10.209909  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:10.209931  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:10.224930  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:10.224946  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:10.310457  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
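Every describe-nodes attempt fails the same way because nothing is listening on the apiserver port (8441 in this run, from --apiserver-port=8441). A quick way to confirm that from inside the node, assuming curl is available there (for example via minikube ssh):

    # /livez is the apiserver's liveness endpoint; -k skips cert verification for a localhost probe.
    curl -sk --max-time 5 https://localhost:8441/livez || echo "apiserver not reachable on 8441"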
	I1217 20:36:12.811296  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:12.821279  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:12.821339  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:12.845496  420062 cri.go:89] found id: ""
	I1217 20:36:12.845510  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.845519  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:12.845524  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:12.845582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:12.873951  420062 cri.go:89] found id: ""
	I1217 20:36:12.873966  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.873973  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:12.873978  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:12.874039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:12.898560  420062 cri.go:89] found id: ""
	I1217 20:36:12.898573  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.898580  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:12.898586  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:12.898661  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:12.931323  420062 cri.go:89] found id: ""
	I1217 20:36:12.931343  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.931350  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:12.931356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:12.931416  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:12.957667  420062 cri.go:89] found id: ""
	I1217 20:36:12.957680  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.957687  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:12.957692  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:12.957749  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:12.981848  420062 cri.go:89] found id: ""
	I1217 20:36:12.981863  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.981870  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:12.981876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:12.981934  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:13.007649  420062 cri.go:89] found id: ""
	I1217 20:36:13.007664  420062 logs.go:282] 0 containers: []
	W1217 20:36:13.007671  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:13.007679  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:13.007689  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:13.070827  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:13.070846  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:13.098938  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:13.098954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:13.155232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:13.155253  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:13.170218  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:13.170234  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:13.237601  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:15.739451  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:15.749635  420062 kubeadm.go:602] duration metric: took 4m4.768391835s to restartPrimaryControlPlane
	W1217 20:36:15.749706  420062 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 20:36:15.749781  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:36:16.165425  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:36:16.179463  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:36:16.187987  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:36:16.188041  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:36:16.195805  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:36:16.195815  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:36:16.195868  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:36:16.203578  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:36:16.203633  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:36:16.211222  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:36:16.218882  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:36:16.218939  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:36:16.226500  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.233980  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:36:16.234040  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.241486  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:36:16.250121  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:36:16.250177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
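The grep/rm sequence above is minikube's stale-kubeconfig cleanup: each file under /etc/kubernetes is checked for the expected control-plane endpoint and removed if the check fails (here every grep exits with status 2 because the files were already gone after the preceding kubeadm reset). A bash sketch of the same logic, with the endpoint taken from this run:

    endpoint="https://control-plane.minikube.internal:8441"
    for f in admin kubelet controller-manager scheduler; do
      conf="/etc/kubernetes/${f}.conf"
      # A missing file or a missing endpoint both trigger removal, matching the log above.
      sudo grep -q "$endpoint" "$conf" 2>/dev/null || sudo rm -f "$conf"
    done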
	I1217 20:36:16.257963  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:36:16.296719  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:36:16.297028  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:36:16.367021  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:36:16.367085  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:36:16.367119  420062 kubeadm.go:319] OS: Linux
	I1217 20:36:16.367163  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:36:16.367211  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:36:16.367257  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:36:16.367304  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:36:16.367351  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:36:16.367397  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:36:16.367441  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:36:16.367493  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:36:16.367539  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:36:16.443855  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:36:16.443958  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:36:16.444047  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:36:16.456800  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:36:16.459720  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:36:16.459808  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:36:16.459875  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:36:16.459957  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:36:16.460026  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:36:16.460100  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:36:16.460156  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:36:16.460222  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:36:16.460299  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:36:16.460377  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:36:16.460454  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:36:16.460493  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:36:16.460552  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:36:16.591707  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:36:16.773515  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:36:16.895942  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:36:17.316963  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:36:17.418134  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:36:17.418872  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:36:17.421748  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:36:17.424898  420062 out.go:252]   - Booting up control plane ...
	I1217 20:36:17.424999  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:36:17.425075  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:36:17.425522  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:36:17.446706  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:36:17.446809  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:36:17.455830  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:36:17.455925  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:36:17.455963  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:36:17.596746  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:36:17.596869  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:40:17.595000  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000220112s
	I1217 20:40:17.595032  420062 kubeadm.go:319] 
	I1217 20:40:17.595086  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:40:17.595116  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:40:17.595215  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:40:17.595220  420062 kubeadm.go:319] 
	I1217 20:40:17.595317  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:40:17.595346  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:40:17.595375  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:40:17.595378  420062 kubeadm.go:319] 
	I1217 20:40:17.599582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:40:17.600077  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:40:17.600181  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:40:17.600461  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:40:17.600468  420062 kubeadm.go:319] 
	I1217 20:40:17.600540  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1217 20:40:17.600694  420062 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
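kubeadm waited the full 4m0s for http://127.0.0.1:10248/healthz and gave up, which means the kubelet process itself never became healthy. The next diagnostics are the two commands kubeadm suggests, plus a direct probe of the health endpoint (standard tools, run on the node):

    systemctl status kubelet --no-pager
    journalctl -xeu kubelet --no-pager | tail -n 50
    # 10248 is the kubelet's healthz port; a healthy kubelet answers "ok".
    curl -sS --max-time 5 http://127.0.0.1:10248/healthz; echo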
	
	I1217 20:40:17.600780  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:40:18.014309  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:40:18.029681  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:40:18.029742  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:40:18.038728  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:40:18.038739  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:40:18.038796  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:40:18.047726  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:40:18.047785  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:40:18.056139  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:40:18.064964  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:40:18.065020  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:40:18.073071  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.081347  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:40:18.081407  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.089386  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:40:18.097546  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:40:18.097608  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:40:18.105445  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:40:18.146508  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:40:18.146883  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:40:18.223079  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:40:18.223139  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:40:18.223171  420062 kubeadm.go:319] OS: Linux
	I1217 20:40:18.223212  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:40:18.223257  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:40:18.223306  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:40:18.223354  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:40:18.223398  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:40:18.223442  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:40:18.223484  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:40:18.223529  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:40:18.223571  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:40:18.290116  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:40:18.290214  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:40:18.290297  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:40:18.296827  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:40:18.300313  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:40:18.300404  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:40:18.300483  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:40:18.300564  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:40:18.300623  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:40:18.300692  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:40:18.300745  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:40:18.300806  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:40:18.300867  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:40:18.300940  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:40:18.301011  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:40:18.301047  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:40:18.301101  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:40:18.651136  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:40:18.865861  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:40:19.156184  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:40:19.613234  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:40:19.777874  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:40:19.778689  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:40:19.781521  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:40:19.784636  420062 out.go:252]   - Booting up control plane ...
	I1217 20:40:19.784726  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:40:19.784798  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:40:19.786110  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:40:19.806173  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:40:19.806463  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:40:19.814039  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:40:19.814294  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:40:19.814465  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:40:19.960654  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:40:19.960777  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:44:19.954818  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001239508s
	I1217 20:44:19.954843  420062 kubeadm.go:319] 
	I1217 20:44:19.954896  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:44:19.954927  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:44:19.955102  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:44:19.955108  420062 kubeadm.go:319] 
	I1217 20:44:19.955205  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:44:19.955233  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:44:19.955262  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:44:19.955265  420062 kubeadm.go:319] 
	I1217 20:44:19.960153  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:44:19.960582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:44:19.960689  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:44:19.960924  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:44:19.960929  420062 kubeadm.go:319] 
	I1217 20:44:19.960996  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
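Of the three preflight warnings, the cgroups one is the likeliest culprit on this cgroup v1 host (kernel 5.15.0-1084-aws): per the warning text and the linked KEP, kubelet v1.35 or newer will not run on cgroup v1 unless the kubelet configuration sets FailCgroupV1 to false. A hedged sketch of opting back in; the camelCase YAML field name is an assumption from the KEP, so verify it against the kubelet configuration reference before use:

    # Append the cgroup v1 opt-in to the kubelet config this run writes, then restart.
    echo "failCgroupV1: false" | sudo tee -a /var/lib/kubelet/config.yaml
    sudo systemctl restart kubelet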
	I1217 20:44:19.961048  420062 kubeadm.go:403] duration metric: took 12m9.01968184s to StartCluster
	I1217 20:44:19.961079  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:44:19.961139  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:44:19.999166  420062 cri.go:89] found id: ""
	I1217 20:44:19.999182  420062 logs.go:282] 0 containers: []
	W1217 20:44:19.999190  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:44:19.999195  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:44:19.999265  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:44:20.031203  420062 cri.go:89] found id: ""
	I1217 20:44:20.031218  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.031225  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:44:20.031230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:44:20.031293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:44:20.061179  420062 cri.go:89] found id: ""
	I1217 20:44:20.061193  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.061200  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:44:20.061219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:44:20.061280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:44:20.089093  420062 cri.go:89] found id: ""
	I1217 20:44:20.089107  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.089114  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:44:20.089120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:44:20.089183  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:44:20.119683  420062 cri.go:89] found id: ""
	I1217 20:44:20.119696  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.119704  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:44:20.119709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:44:20.119772  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:44:20.145500  420062 cri.go:89] found id: ""
	I1217 20:44:20.145514  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.145521  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:44:20.145526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:44:20.145586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:44:20.170345  420062 cri.go:89] found id: ""
	I1217 20:44:20.170359  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.170367  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:44:20.170377  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:44:20.170387  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:44:20.226476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:44:20.226496  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:44:20.241970  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:44:20.241987  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:44:20.311525  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:44:20.311535  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:44:20.311546  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:44:20.375759  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:44:20.375781  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 20:44:20.404823  420062 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:44:20.404857  420062 out.go:285] * 
	W1217 20:44:20.404931  420062 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.404948  420062 out.go:285] * 
	W1217 20:44:20.407052  420062 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:44:20.412138  420062 out.go:203] 
	W1217 20:44:20.415946  420062 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.415994  420062 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:44:20.416018  420062 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:44:20.419093  420062 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304459447Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304532998Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304632437Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304709099Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304775544Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304836714Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304892469Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304951784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305023309Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305106805Z" level=info msg="Connect containerd service"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305473562Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.306163145Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318314045Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318400322Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318427285Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318481078Z" level=info msg="Start recovering state"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358031279Z" level=info msg="Start event monitor"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358217808Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358291688Z" level=info msg="Start streaming server"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358359291Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358415021Z" level=info msg="runtime interface starting up..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358467600Z" level=info msg="starting plugins..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358529204Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:32:09 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.361000854Z" level=info msg="containerd successfully booted in 0.082346s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:23.714623   21317 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:23.715283   21317 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:23.716864   21317 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:23.717401   21317 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:23.718906   21317 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:44:23 up  3:26,  0 user,  load average: 0.56, 0.25, 0.47
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:44:20 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 17 20:44:21 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:21 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:21 functional-682596 kubelet[21166]: E1217 20:44:21.541596   21166 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:21 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:22 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Dec 17 20:44:22 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:22 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:22 functional-682596 kubelet[21194]: E1217 20:44:22.322384   21194 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:22 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:22 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:22 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 324.
	Dec 17 20:44:22 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:22 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:23 functional-682596 kubelet[21230]: E1217 20:44:23.051841   21230 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:23 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:23 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:23 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 325.
	Dec 17 20:44:23 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:23 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:23 functional-682596 kubelet[21322]: E1217 20:44:23.787250   21322 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:23 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:23 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (361.94758ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.16s)
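For reference, a minimal sketch of the remediation spelled out by the SystemVerification warning and the kubelet journal above, assuming the KubeletConfiguration field is serialized as failCgroupV1 (the warning only names the option as 'FailCgroupV1') and that the key is not already present in the file kubeadm reports writing:

	# sketch: explicitly re-enable cgroup v1 support for kubelet v1.35+,
	# per the warning text above, then restart the unit systemd keeps cycling
	echo 'failCgroupV1: false' | sudo tee -a /var/lib/kubelet/config.yaml
	sudo systemctl restart kubelet

The log's own Suggestion line instead proposes --extra-config=kubelet.cgroup-driver=systemd, but the journal entries above point at the cgroup v1 validation, not the cgroup driver.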

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-682596 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-682596 apply -f testdata/invalidsvc.yaml: exit status 1 (62.053945ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-682596 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)
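The error text above names --validate=false as the client-side escape hatch; a hypothetical invocation for completeness (it would not rescue this run, since the apply itself still needs the apiserver that is refusing connections):

	# sketch: skip client-side schema validation, as the error message suggests;
	# the request would still fail here because 192.168.49.2:8441 is unreachable
	kubectl --context functional-682596 apply -f testdata/invalidsvc.yaml --validate=false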

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.92s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-682596 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-682596 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-682596 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-682596 --alsologtostderr -v=1] stderr:
I1217 20:46:59.279638  439233 out.go:360] Setting OutFile to fd 1 ...
I1217 20:46:59.279767  439233 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:46:59.279775  439233 out.go:374] Setting ErrFile to fd 2...
I1217 20:46:59.279781  439233 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:46:59.280033  439233 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:46:59.280319  439233 mustload.go:66] Loading cluster: functional-682596
I1217 20:46:59.280754  439233 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:46:59.281226  439233 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:46:59.299461  439233 host.go:66] Checking if "functional-682596" exists ...
I1217 20:46:59.299794  439233 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1217 20:46:59.366485  439233 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:59.357222497 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1217 20:46:59.366608  439233 api_server.go:166] Checking apiserver status ...
I1217 20:46:59.366677  439233 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1217 20:46:59.366726  439233 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:46:59.384976  439233 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
W1217 20:46:59.481921  439233 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1217 20:46:59.485033  439233 out.go:179] * The control-plane node functional-682596 apiserver is not running: (state=Stopped)
I1217 20:46:59.487793  439233 out.go:179]   To start a cluster, run: "minikube start -p functional-682596"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
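The 22/tcp entry in the NetworkSettings block above is the mapping minikube resolved earlier with cli_runner before opening its SSH session; a standalone sketch of the same lookup, with the command and Go template copied from that run:

	# sketch: resolve the host port mapped to the container's SSH port;
	# against the inspect output above this prints 33163
	docker container inspect \
	  -f '{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}' \
	  functional-682596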
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (305.463951ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
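Both status probes in this report drive minikube's --format flag, a Go template over the status struct; a sketch combining the two fields the harness checks separately (field names taken verbatim from the templates above):

	# sketch: query host and apiserver state in a single call
	out/minikube-linux-arm64 status -p functional-682596 \
	  --format='host:{{.Host}} apiserver:{{.APIServer}}'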
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                        │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons    │ functional-682596 addons list -o json                                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ mount     │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001:/mount-9p --alsologtostderr -v=1             │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh -- ls -la /mount-9p                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh cat /mount-9p/test-1766004412662439027                                                                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh sudo umount -f /mount-9p                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount     │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun793336896/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh -- ls -la /mount-9p                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh sudo umount -f /mount-9p                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount     │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount1 --alsologtostderr -v=1               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount     │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount2 --alsologtostderr -v=1               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount     │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount3 --alsologtostderr -v=1               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh findmnt -T /mount1                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh       │ functional-682596 ssh findmnt -T /mount1                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh findmnt -T /mount2                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh       │ functional-682596 ssh findmnt -T /mount3                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ mount     │ -p functional-682596 --kill=true                                                                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start     │ -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1  │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start     │ -p functional-682596 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start     │ -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1  │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-682596 --alsologtostderr -v=1                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	└───────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:46:59
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:46:59.081910  439189 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:46:59.082355  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082395  439189 out.go:374] Setting ErrFile to fd 2...
	I1217 20:46:59.082418  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082870  439189 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:46:59.083356  439189 out.go:368] Setting JSON to false
	I1217 20:46:59.084244  439189 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":12564,"bootTime":1765991855,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:46:59.084377  439189 start.go:143] virtualization:  
	I1217 20:46:59.087733  439189 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:46:59.091444  439189 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:46:59.091535  439189 notify.go:221] Checking for updates...
	I1217 20:46:59.097262  439189 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:46:59.100117  439189 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:46:59.102903  439189 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:46:59.105734  439189 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:46:59.108516  439189 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:46:59.111896  439189 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:46:59.112611  439189 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:46:59.134996  439189 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:46:59.135121  439189 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:46:59.205094  439189 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:59.195937607 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:46:59.205199  439189 docker.go:319] overlay module found
	I1217 20:46:59.208339  439189 out.go:179] * Using the docker driver based on existing profile
	I1217 20:46:59.211199  439189 start.go:309] selected driver: docker
	I1217 20:46:59.211235  439189 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:46:59.211332  439189 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:46:59.214852  439189 out.go:203] 
	W1217 20:46:59.217732  439189 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1217 20:46:59.220709  439189 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.485700024Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.528847726Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.532364395Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.541542678Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.562388570Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.886013953Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.888517483Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896652382Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896979157Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.215090972Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.217900174Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.221232159Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.234208610Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.526440562Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.528721959Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.537888598Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.538209006Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.567338451Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.569906392Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.572864134Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.580393179Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.405679689Z" level=info msg="No images store for sha256:05371fd6ad950eede907960b388fa9b50b39adf62f93dec0b13c9fc4ce7e1bc1"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.408072965Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.415708801Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.416196737Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:47:00.748158   24186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:47:00.748588   24186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:47:00.750084   24186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:47:00.750433   24186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:47:00.751885   24186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:47:00 up  3:29,  0 user,  load average: 1.48, 0.70, 0.60
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:46:57 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:57 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 530.
	Dec 17 20:46:57 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:57 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:58 functional-682596 kubelet[24046]: E1217 20:46:58.050938   24046 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:58 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:58 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:58 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 531.
	Dec 17 20:46:58 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:58 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:58 functional-682596 kubelet[24068]: E1217 20:46:58.801802   24068 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:58 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:58 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:59 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 532.
	Dec 17 20:46:59 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:59 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:59 functional-682596 kubelet[24083]: E1217 20:46:59.546129   24083 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:59 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:59 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:47:00 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 533.
	Dec 17 20:47:00 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:47:00 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:47:00 functional-682596 kubelet[24104]: E1217 20:47:00.353494   24104 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:47:00 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:47:00 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
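The kubelet section in the logs above shows a tight restart loop (restart counters 530 through 533 within about three seconds), with every attempt failing the same validation: this kubelet build refuses to start on a host using cgroup v1. A quick way to confirm which cgroup hierarchy the node container actually sees is to stat its cgroup mount (a diagnostic sketch, assuming the container name matches the profile above):

    # cgroup2fs => cgroup v2; tmpfs => cgroup v1 (which this kubelet rejects)
    docker exec functional-682596 stat -fc %T /sys/fs/cgroup/

The Ubuntu 20.04 host reported in the logs still defaults to cgroup v1, which is consistent with the failure.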
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (305.467918ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.92s)
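Both the dashboard failure and the earlier `kubectl describe nodes` error point at the same root cause: nothing is listening on the apiserver port (connection refused on 8441). A direct health probe against the advertised endpoint separates a down apiserver from a stale kubeconfig; an illustrative check using the node IP and port from the cluster config above (/healthz is typically anonymously readable under default RBAC):

    # "ok" => apiserver healthy; "connection refused" => not listening at all
    curl -k https://192.168.49.2:8441/healthz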

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 status: exit status 2 (307.504708ms)

-- stdout --
	functional-682596
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-682596 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (331.337499ms)

-- stdout --
	host:Running,kublet:Running,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-682596 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
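Note the failure above is about the exit status, not the format string: `minikube status -f` takes a Go template over the status struct, and the `kublet:` text in the test's template is a literal label (with a typo in the test source itself), while only the `{{.Kubelet}}` field reference is interpreted. A minimal sketch of the same query with the label spelled out:

    out/minikube-linux-arm64 -p functional-682596 status -f 'kubelet:{{.Kubelet}},apiserver:{{.APIServer}}'
    # prints the corresponding fields; as above, the exit status stays 2 while components are down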
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 status -o json: exit status 2 (307.81305ms)

-- stdout --
	{"Name":"functional-682596","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-682596 status -o json" : exit status 2
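Because the JSON status still lands on stdout despite the non-zero exit (as the capture above shows), individual fields remain scriptable; a sketch assuming `jq` is available on the runner:

    out/minikube-linux-arm64 -p functional-682596 status -o json | jq -r .APIServer
    # -> Stopped (per the output captured above)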
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
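Rather than dumping the whole document, `docker inspect -f` can pull single fields with the same Go-template syntax minikube itself uses in the logs below; an illustrative query whose expected values come from the JSON just shown:

    # memory limit and the host port published for the apiserver (8441/tcp)
    docker inspect -f '{{.HostConfig.Memory}} {{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}' functional-682596
    # -> 4294967296 33166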
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (288.513089ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-682596 ssh sudo cat /usr/share/ca-certificates/369461.pem                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/3694612.pem                                                                                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image save kicbase/echo-server:functional-682596 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /usr/share/ca-certificates/3694612.pem                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image rm kicbase/echo-server:functional-682596 --alsologtostderr                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/test/nested/copy/369461/hosts                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service list                                                                                                                                  │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service list -o json                                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ image   │ functional-682596 image save --daemon kicbase/echo-server:functional-682596 --alsologtostderr                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service --namespace=default --https --url hello-node                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ ssh     │ functional-682596 ssh echo hello                                                                                                                                │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service hello-node --url --format={{.IP}}                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ ssh     │ functional-682596 ssh cat /etc/hostname                                                                                                                         │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service hello-node --url                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ addons  │ functional-682596 addons list                                                                                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ addons  │ functional-682596 addons list -o json                                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:32:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:32:06.395598  420062 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:32:06.395704  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395708  420062 out.go:374] Setting ErrFile to fd 2...
	I1217 20:32:06.395712  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395972  420062 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:32:06.396388  420062 out.go:368] Setting JSON to false
	I1217 20:32:06.397206  420062 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11672,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:32:06.397266  420062 start.go:143] virtualization:  
	I1217 20:32:06.400889  420062 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:32:06.403953  420062 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:32:06.404019  420062 notify.go:221] Checking for updates...
	I1217 20:32:06.410244  420062 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:32:06.413231  420062 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:32:06.416152  420062 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:32:06.419145  420062 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:32:06.422186  420062 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:32:06.425355  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:06.425444  420062 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:32:06.459431  420062 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:32:06.459555  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.531840  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.520070933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.531937  420062 docker.go:319] overlay module found
	I1217 20:32:06.535075  420062 out.go:179] * Using the docker driver based on existing profile
	I1217 20:32:06.538013  420062 start.go:309] selected driver: docker
	I1217 20:32:06.538025  420062 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.538123  420062 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:32:06.538239  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.599898  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.590438982 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.600362  420062 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:32:06.600387  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:06.600439  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:06.600480  420062 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.605529  420062 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:32:06.608314  420062 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:32:06.611190  420062 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:32:06.614228  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:06.614282  420062 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:32:06.614283  420062 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:32:06.614291  420062 cache.go:65] Caching tarball of preloaded images
	I1217 20:32:06.614394  420062 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:32:06.614404  420062 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:32:06.614527  420062 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:32:06.634867  420062 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:32:06.634879  420062 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:32:06.634892  420062 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:32:06.634927  420062 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:32:06.634983  420062 start.go:364] duration metric: took 39.828µs to acquireMachinesLock for "functional-682596"
	I1217 20:32:06.635002  420062 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:32:06.635007  420062 fix.go:54] fixHost starting: 
	I1217 20:32:06.635262  420062 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:32:06.652755  420062 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:32:06.652776  420062 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:32:06.656001  420062 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:32:06.656027  420062 machine.go:94] provisionDockerMachine start ...
	I1217 20:32:06.656117  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.673371  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.673711  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.673717  420062 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:32:06.807817  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.807832  420062 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:32:06.807905  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.825970  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.826266  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.826274  420062 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:32:06.965026  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.965108  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.983394  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.983695  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.983710  420062 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:32:07.116833  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:32:07.116850  420062 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:32:07.116869  420062 ubuntu.go:190] setting up certificates
	I1217 20:32:07.116877  420062 provision.go:84] configureAuth start
	I1217 20:32:07.116947  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.134531  420062 provision.go:143] copyHostCerts
	I1217 20:32:07.134601  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:32:07.134608  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:32:07.134696  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:32:07.134816  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:32:07.134820  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:32:07.134849  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:32:07.134907  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:32:07.134911  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:32:07.134937  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:32:07.134994  420062 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
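	A server certificate with the SANs listed in the provision.go line above can be reproduced by hand with openssl; this is an illustrative sketch only (minikube does this internally in Go), and the ca.pem/ca-key.pem/server*.pem filenames stand in for the full paths in the log:
	  # hypothetical manual equivalent, assuming bash and openssl on PATH
	  openssl req -new -newkey rsa:2048 -nodes -keyout server-key.pem -out server.csr \
	    -subj "/O=jenkins.functional-682596"
	  openssl x509 -req -in server.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial \
	    -out server.pem -days 365 \
	    -extfile <(printf "subjectAltName=IP:127.0.0.1,IP:192.168.49.2,DNS:functional-682596,DNS:localhost,DNS:minikube")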
	I1217 20:32:07.402222  420062 provision.go:177] copyRemoteCerts
	I1217 20:32:07.402275  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:32:07.402313  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.421789  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.516787  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:32:07.535734  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:32:07.553569  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 20:32:07.572193  420062 provision.go:87] duration metric: took 455.301945ms to configureAuth
	I1217 20:32:07.572211  420062 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:32:07.572513  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:07.572520  420062 machine.go:97] duration metric: took 916.488302ms to provisionDockerMachine
	I1217 20:32:07.572527  420062 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:32:07.572544  420062 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:32:07.572595  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:32:07.572635  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.593078  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.688373  420062 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:32:07.691957  420062 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:32:07.691978  420062 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:32:07.691989  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:32:07.692044  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:32:07.692122  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:32:07.692197  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:32:07.692238  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:32:07.699873  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.718147  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:32:07.736089  420062 start.go:296] duration metric: took 163.546649ms for postStartSetup
	I1217 20:32:07.736163  420062 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:32:07.736210  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.753837  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.845496  420062 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:32:07.850448  420062 fix.go:56] duration metric: took 1.215434362s for fixHost
	I1217 20:32:07.850463  420062 start.go:83] releasing machines lock for "functional-682596", held for 1.215473649s
	I1217 20:32:07.850551  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.871450  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:07.871498  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:07.871505  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:07.871531  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:07.871602  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:07.871627  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:07.871680  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.871748  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:07.871798  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.889554  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.998672  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:08.024673  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:08.048014  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:08.055454  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.065155  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:08.073391  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077720  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077778  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.119356  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:08.127518  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.135465  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:08.143207  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147322  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147376  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.188376  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:08.196028  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.203401  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:08.211111  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214821  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214891  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.256072  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
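	The three openssl/ln round-trips above follow the standard OpenSSL CA-directory convention: hash the certificate subject, then publish the cert under /etc/ssl/certs as <hash>.0. A minimal sketch of one iteration, assuming only that openssl is on PATH:
	  CERT=/usr/share/ca-certificates/minikubeCA.pem
	  HASH=$(openssl x509 -hash -noout -in "$CERT")    # prints the subject hash, e.g. b5213941
	  sudo ln -fs "$CERT" "/etc/ssl/certs/${HASH}.0"   # <hash>.0 is the name OpenSSL resolves CAs by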
	I1217 20:32:08.263331  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:32:08.266724  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
	I1217 20:32:08.270040  420062 ssh_runner.go:195] Run: cat /version.json
	I1217 20:32:08.270111  420062 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:32:08.361093  420062 ssh_runner.go:195] Run: systemctl --version
	I1217 20:32:08.367706  420062 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:32:08.372063  420062 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:32:08.372127  420062 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:32:08.380119  420062 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:32:08.380133  420062 start.go:496] detecting cgroup driver to use...
	I1217 20:32:08.380163  420062 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:32:08.380223  420062 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:32:08.395765  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:32:08.409064  420062 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:32:08.409142  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:32:08.425141  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:32:08.438808  420062 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:32:08.558555  420062 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:32:08.681937  420062 docker.go:234] disabling docker service ...
	I1217 20:32:08.681997  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:32:08.701323  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:32:08.715923  420062 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:32:08.835610  420062 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:32:08.958372  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:32:08.972822  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:32:08.987570  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:32:08.997169  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:32:09.008742  420062 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:32:09.008821  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:32:09.018997  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.028318  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:32:09.037280  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.046375  420062 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:32:09.054925  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:32:09.064191  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:32:09.073303  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:32:09.082553  420062 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:32:09.090003  420062 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:32:09.097524  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.216967  420062 ssh_runner.go:195] Run: sudo systemctl restart containerd
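	The sed edits above force containerd onto the cgroupfs driver before the restart. An assumed manual verification of the rewrite and of the runtime's view after restart (illustrative, not part of the test):
	  sudo grep -n 'SystemdCgroup' /etc/containerd/config.toml   # expect: SystemdCgroup = false
	  sudo crictl info | grep -i cgroup                          # runtime-reported cgroup settings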
	I1217 20:32:09.360558  420062 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:32:09.360617  420062 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:32:09.364443  420062 start.go:564] Will wait 60s for crictl version
	I1217 20:32:09.364497  420062 ssh_runner.go:195] Run: which crictl
	I1217 20:32:09.368129  420062 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:32:09.397262  420062 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:32:09.397334  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.420778  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.446347  420062 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:32:09.449338  420062 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
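	The Go template in the inspect call above condenses subnet, gateway, MTU and container IPs into one JSON blob; for interactive debugging the same IPAM data is available with a simpler (illustrative) format string:
	  docker network inspect functional-682596 --format '{{json .IPAM.Config}}'   # subnet and gateway only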
	I1217 20:32:09.466521  420062 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:32:09.473221  420062 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1217 20:32:09.476024  420062 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:32:09.476173  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:09.476285  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.523837  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.523848  420062 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:32:09.523905  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.551003  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.551014  420062 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:32:09.551021  420062 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:32:09.551143  420062 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1217 20:32:09.551208  420062 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:32:09.578643  420062 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1217 20:32:09.578665  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:09.578673  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:09.578683  420062 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:32:09.578707  420062 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:32:09.578827  420062 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1217 20:32:09.578904  420062 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:32:09.586879  420062 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:32:09.586939  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:32:09.594505  420062 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:32:09.607281  420062 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:32:09.619808  420062 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
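	If the kubeadm.yaml.new written above ever needs checking by hand, recent kubeadm releases ship a config validator; this is an assumed manual step (it presumes `kubeadm config validate` exists in this v1.35.0-rc.1 binary), not something the test runs:
	  sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubeadm config validate --config /var/tmp/minikube/kubeadm.yaml.new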
	I1217 20:32:09.632685  420062 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:32:09.636364  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.746796  420062 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:32:10.238623  420062 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:32:10.238634  420062 certs.go:195] generating shared ca certs ...
	I1217 20:32:10.238650  420062 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:32:10.238819  420062 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:32:10.238897  420062 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:32:10.238904  420062 certs.go:257] generating profile certs ...
	I1217 20:32:10.238995  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:32:10.239044  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:32:10.239082  420062 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:32:10.239190  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:10.239221  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:10.239227  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:10.239261  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:10.239282  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:10.239304  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:10.239345  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:10.239934  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:32:10.261870  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:32:10.286466  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:32:10.307033  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:32:10.325172  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:32:10.343499  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:32:10.361814  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:32:10.379595  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:32:10.397590  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:10.415855  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:10.435021  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:10.453267  420062 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:32:10.466474  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:10.472863  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.480366  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:10.487904  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491724  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491791  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.533110  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:10.540758  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.548093  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:10.555384  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.558983  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.559039  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.602447  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:10.609962  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.617251  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:10.625102  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629186  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629244  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.670572  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:10.678295  420062 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:32:10.682347  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:32:10.723286  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:32:10.764614  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:32:10.806369  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:32:10.856829  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:32:10.900136  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1217 20:32:10.941380  420062 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:10.941458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:32:10.941532  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:10.973304  420062 cri.go:89] found id: ""
	I1217 20:32:10.973369  420062 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:32:10.981213  420062 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:32:10.981233  420062 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:32:10.981284  420062 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:32:10.989643  420062 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:10.990148  420062 kubeconfig.go:125] found "functional-682596" server: "https://192.168.49.2:8441"
	I1217 20:32:10.991404  420062 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:32:11.001770  420062 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 20:17:35.203485302 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 20:32:09.624537089 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
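	The drift decision logged above hinges on diff's exit status: kubeadm.go reconfigures whenever the deployed kubeadm.yaml and the freshly rendered kubeadm.yaml.new differ. An equivalent shell form (illustrative only, not minikube's code):
	  if ! sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new >/dev/null; then
	    echo "kubeadm config drift detected; reconfiguring from kubeadm.yaml.new"
	  fi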
	I1217 20:32:11.001793  420062 kubeadm.go:1161] stopping kube-system containers ...
	I1217 20:32:11.001810  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 20:32:11.001907  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:11.031815  420062 cri.go:89] found id: ""
	I1217 20:32:11.031894  420062 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 20:32:11.052689  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:32:11.061497  420062 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec 17 20:21 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 17 20:21 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 17 20:21 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 17 20:21 /etc/kubernetes/scheduler.conf
	
	I1217 20:32:11.061561  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:32:11.069861  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:32:11.077903  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.077964  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:32:11.085969  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.094098  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.094177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.102002  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:32:11.110213  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.110288  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:32:11.119148  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:32:11.127567  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:11.176595  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.173518  420062 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.996897383s)
	I1217 20:32:13.173578  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.380045  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.450955  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.494559  420062 api_server.go:52] waiting for apiserver process to appear ...
	I1217 20:32:13.494629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:13.995499  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:14.495246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:14.995004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:15.494932  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:15.995036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:16.495074  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:16.994872  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:17.495380  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:17.995751  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:18.495343  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:18.994970  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:19.494770  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:19.994830  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:20.495505  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:20.994898  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:21.495023  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:21.995478  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:22.495349  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:22.995690  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:23.495439  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:23.995543  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:24.495694  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:24.995422  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:25.495295  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:25.994704  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:26.495710  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:26.995337  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:27.494832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:27.995523  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:28.494851  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:28.995537  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:29.495464  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:29.994938  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:30.494723  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:30.995506  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:31.494922  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:31.995021  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:32.495513  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:32.995616  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:33.494819  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:33.995255  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:34.495487  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:34.994841  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:35.494829  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:35.994738  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:36.495064  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:36.995222  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:37.495670  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:37.995598  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:38.495022  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:38.994778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:39.494800  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:39.995546  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:40.495339  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:40.995490  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:41.495730  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:41.995344  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:42.494837  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:42.994782  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:43.495499  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:43.994789  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:44.495147  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:44.994920  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:45.495463  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:45.994922  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:46.495042  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:46.994829  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:47.495629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:47.994850  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:48.495359  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:48.994705  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:49.494785  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:49.995746  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:50.495699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:50.994838  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:51.494890  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:51.995223  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:52.495608  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:52.995342  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:53.495633  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:53.994828  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:54.495690  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:54.995411  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:55.495390  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:55.994857  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:56.494814  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:56.995195  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:57.494792  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:57.995068  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:58.494828  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:58.995135  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:59.495101  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:59.994696  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:00.494847  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:00.994832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:01.495150  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:01.994869  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:02.494983  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:02.995441  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:03.495150  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:03.994800  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:04.494955  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:04.995595  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:05.495571  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:05.995745  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:06.494913  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:06.994802  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:07.494809  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:07.995731  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:08.495034  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:08.995352  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:09.494830  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:09.995574  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:10.495663  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:10.995478  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:11.494754  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:11.995704  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:12.494787  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:12.995364  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
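	The repeated pgrep probes above are api_server.go's process wait, fired roughly every 500ms from 20:32:13 until it gives up about 60s later. A hypothetical shell equivalent of that loop (minikube implements it in Go; shown only to make the cadence and the timeout visible):
	  deadline=$(( $(date +%s) + 60 ))
	  until sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; do
	    [ "$(date +%s)" -ge "$deadline" ] && { echo 'kube-apiserver never appeared'; break; }
	    sleep 0.5
	  done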
	I1217 20:33:13.495637  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:13.495716  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:13.520703  420062 cri.go:89] found id: ""
	I1217 20:33:13.520717  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.520724  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:13.520729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:13.520793  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:13.549658  420062 cri.go:89] found id: ""
	I1217 20:33:13.549672  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.549680  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:13.549685  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:13.549748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:13.574860  420062 cri.go:89] found id: ""
	I1217 20:33:13.574873  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.574880  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:13.574885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:13.574945  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:13.602159  420062 cri.go:89] found id: ""
	I1217 20:33:13.602173  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.602180  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:13.602185  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:13.602244  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:13.625735  420062 cri.go:89] found id: ""
	I1217 20:33:13.625748  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.625755  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:13.625760  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:13.625816  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:13.650446  420062 cri.go:89] found id: ""
	I1217 20:33:13.650460  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.650468  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:13.650473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:13.650533  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:13.677915  420062 cri.go:89] found id: ""
	I1217 20:33:13.677929  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.677936  420062 logs.go:284] No container was found matching "kindnet"
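Every `found id: ""` above is a per-component crictl query coming back empty, i.e. no control-plane container was ever created. A sketch of that discovery pass, using the command shape and component list from the log (the Go wrapper itself is illustrative):

// discover.go: list container IDs per control-plane component via crictl.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	components := []string{
		"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet",
	}
	for _, name := range components {
		// Same query as in the log; --quiet prints one container ID per line.
		// The error is ignored: an empty result means no containers either way.
		out, _ := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
		ids := strings.Fields(string(out))
		fmt.Printf("%s: %d container(s)\n", name, len(ids))
	}
}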
	I1217 20:33:13.677944  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:13.677954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:13.692434  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:13.692449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:13.767790  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
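The repeated `connection refused` on [::1]:8441 means nothing is listening on the apiserver port, not that kubectl is misconfigured. A quick standalone check of the same condition, assuming port 8441 as used throughout this run:

// probe.go: is anything accepting TCP connections on the apiserver port?
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver not reachable:", err) // e.g. connection refused
		return
	}
	conn.Close()
	fmt.Println("port 8441 is accepting connections")
}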
	I1217 20:33:13.767810  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:13.767820  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:13.839665  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:13.839685  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:13.872573  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:13.872589  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
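With no containers to inspect, the fallback diagnostics reduce to the journald, dmesg, and container-status pulls shown above. An illustrative wrapper that gathers the same sources (unit names and the 400-line window come from the log; the wrapper is an assumption):

// gather.go: collect the same diagnostics minikube falls back to.
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	cmds := [][]string{
		{"journalctl", "-u", "kubelet", "-n", "400"},
		{"journalctl", "-u", "containerd", "-n", "400"},
		{"bash", "-c", "dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"},
	}
	for _, c := range cmds {
		out, err := exec.Command("sudo", c...).CombinedOutput()
		if err != nil {
			fmt.Printf("%v failed: %v\n", c, err)
			continue
		}
		fmt.Printf("--- %v ---\n%s\n", c, out)
	}
}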
	I1217 20:33:16.429115  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:16.438989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:16.439051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:16.466518  420062 cri.go:89] found id: ""
	I1217 20:33:16.466532  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.466539  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:16.466545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:16.466602  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:16.492200  420062 cri.go:89] found id: ""
	I1217 20:33:16.492213  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.492221  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:16.492226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:16.492302  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:16.517055  420062 cri.go:89] found id: ""
	I1217 20:33:16.517070  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.517083  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:16.517088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:16.517148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:16.552138  420062 cri.go:89] found id: ""
	I1217 20:33:16.552152  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.552159  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:16.552165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:16.552235  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:16.577184  420062 cri.go:89] found id: ""
	I1217 20:33:16.577198  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.577214  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:16.577220  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:16.577279  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:16.602039  420062 cri.go:89] found id: ""
	I1217 20:33:16.602053  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.602060  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:16.602066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:16.602124  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:16.626732  420062 cri.go:89] found id: ""
	I1217 20:33:16.626745  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.626752  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:16.626760  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:16.626770  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:16.689454  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:16.689473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:16.722345  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:16.722363  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.784686  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:16.784705  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:16.801895  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:16.801911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:16.865697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:19.365915  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:19.375998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:19.376066  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:19.399955  420062 cri.go:89] found id: ""
	I1217 20:33:19.399968  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.399976  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:19.399981  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:19.400039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:19.424668  420062 cri.go:89] found id: ""
	I1217 20:33:19.424682  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.424689  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:19.424695  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:19.424755  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:19.449865  420062 cri.go:89] found id: ""
	I1217 20:33:19.449879  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.449886  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:19.449891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:19.449958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:19.474803  420062 cri.go:89] found id: ""
	I1217 20:33:19.474816  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.474833  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:19.474838  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:19.474909  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:19.503551  420062 cri.go:89] found id: ""
	I1217 20:33:19.503579  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.503598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:19.503603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:19.503687  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:19.529232  420062 cri.go:89] found id: ""
	I1217 20:33:19.529246  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.529259  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:19.529264  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:19.529330  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:19.554443  420062 cri.go:89] found id: ""
	I1217 20:33:19.554456  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.554463  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:19.554481  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:19.554491  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:19.609391  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:19.609411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:19.625653  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:19.625669  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:19.691445  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:19.691456  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:19.691466  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:19.754663  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:19.754682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:22.297725  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:22.309139  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:22.309199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:22.334369  420062 cri.go:89] found id: ""
	I1217 20:33:22.334382  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.334390  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:22.334395  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:22.334458  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:22.363418  420062 cri.go:89] found id: ""
	I1217 20:33:22.363445  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.363453  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:22.363458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:22.363531  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:22.388924  420062 cri.go:89] found id: ""
	I1217 20:33:22.388939  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.388947  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:22.388993  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:22.389056  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:22.415757  420062 cri.go:89] found id: ""
	I1217 20:33:22.415780  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.415787  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:22.415793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:22.415872  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:22.441520  420062 cri.go:89] found id: ""
	I1217 20:33:22.441534  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.441541  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:22.441546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:22.441605  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:22.480775  420062 cri.go:89] found id: ""
	I1217 20:33:22.480789  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.480795  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:22.480801  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:22.480873  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:22.505556  420062 cri.go:89] found id: ""
	I1217 20:33:22.505570  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.505577  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:22.505585  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:22.505596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:22.562036  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:22.562054  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:22.577369  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:22.577386  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:22.647423  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:22.638838   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.639486   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641272   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641956   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.643602   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:22.647453  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:22.647464  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:22.710153  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:22.710173  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.239783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:25.250945  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:25.251006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:25.277422  420062 cri.go:89] found id: ""
	I1217 20:33:25.277435  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.277443  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:25.277448  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:25.277510  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:25.303032  420062 cri.go:89] found id: ""
	I1217 20:33:25.303051  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.303063  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:25.303070  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:25.303176  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:25.333183  420062 cri.go:89] found id: ""
	I1217 20:33:25.333197  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.333204  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:25.333209  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:25.333272  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:25.358899  420062 cri.go:89] found id: ""
	I1217 20:33:25.358913  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.358920  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:25.358926  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:25.358986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:25.388611  420062 cri.go:89] found id: ""
	I1217 20:33:25.388625  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.388633  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:25.388638  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:25.388704  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:25.415829  420062 cri.go:89] found id: ""
	I1217 20:33:25.415844  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.415852  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:25.415857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:25.415913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:25.442921  420062 cri.go:89] found id: ""
	I1217 20:33:25.442935  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.442941  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:25.442949  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:25.442965  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:25.459113  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:25.459135  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:25.535629  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:25.526636   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.527172   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.528838   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.529443   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.530989   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:25.535645  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:25.535655  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:25.601950  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:25.601968  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.634192  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:25.634208  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.190569  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:28.200504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:28.200563  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:28.224311  420062 cri.go:89] found id: ""
	I1217 20:33:28.224325  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.224332  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:28.224338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:28.224396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:28.252603  420062 cri.go:89] found id: ""
	I1217 20:33:28.252622  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.252629  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:28.252634  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:28.252692  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:28.276684  420062 cri.go:89] found id: ""
	I1217 20:33:28.276697  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.276704  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:28.276709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:28.276777  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:28.299922  420062 cri.go:89] found id: ""
	I1217 20:33:28.299935  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.299942  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:28.299947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:28.300014  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:28.326124  420062 cri.go:89] found id: ""
	I1217 20:33:28.326137  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.326144  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:28.326150  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:28.326218  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:28.349497  420062 cri.go:89] found id: ""
	I1217 20:33:28.349510  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.349517  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:28.349523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:28.349579  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:28.378156  420062 cri.go:89] found id: ""
	I1217 20:33:28.378170  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.378177  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:28.378185  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:28.378194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.434254  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:28.434274  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:28.448810  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:28.448837  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:28.521268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:28.512905   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.513656   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515366   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515890   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.517404   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:28.521279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:28.521290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:28.584201  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:28.584222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.112699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:31.123315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:31.123377  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:31.151761  420062 cri.go:89] found id: ""
	I1217 20:33:31.151776  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.151783  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:31.151789  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:31.151849  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:31.177165  420062 cri.go:89] found id: ""
	I1217 20:33:31.177178  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.177186  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:31.177191  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:31.177262  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:31.205229  420062 cri.go:89] found id: ""
	I1217 20:33:31.205260  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.205267  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:31.205272  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:31.205341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:31.229570  420062 cri.go:89] found id: ""
	I1217 20:33:31.229584  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.229591  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:31.229597  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:31.229673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:31.258880  420062 cri.go:89] found id: ""
	I1217 20:33:31.258904  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.258911  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:31.258917  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:31.258983  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:31.286222  420062 cri.go:89] found id: ""
	I1217 20:33:31.286241  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.286248  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:31.286253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:31.286315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:31.311291  420062 cri.go:89] found id: ""
	I1217 20:33:31.311314  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.311322  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:31.311330  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:31.311340  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.342524  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:31.342541  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:31.398421  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:31.398440  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:31.413476  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:31.413497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:31.478376  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:31.469734   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.470537   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472118   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472657   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.474358   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:31.478388  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:31.478398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.044394  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:34.054571  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:34.054632  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:34.078791  420062 cri.go:89] found id: ""
	I1217 20:33:34.078815  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.078822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:34.078827  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:34.078902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:34.103484  420062 cri.go:89] found id: ""
	I1217 20:33:34.103498  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.103505  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:34.103510  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:34.103578  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:34.128330  420062 cri.go:89] found id: ""
	I1217 20:33:34.128343  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.128362  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:34.128368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:34.128436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:34.156115  420062 cri.go:89] found id: ""
	I1217 20:33:34.156129  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.156136  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:34.156141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:34.156208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:34.179862  420062 cri.go:89] found id: ""
	I1217 20:33:34.179876  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.179884  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:34.179889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:34.179959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:34.205717  420062 cri.go:89] found id: ""
	I1217 20:33:34.205731  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.205739  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:34.205745  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:34.205804  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:34.230674  420062 cri.go:89] found id: ""
	I1217 20:33:34.230689  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.230702  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:34.230710  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:34.230720  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:34.286930  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:34.286949  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:34.301786  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:34.301803  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:34.365439  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:34.365461  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:34.365473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.426703  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:34.426724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:36.954941  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:36.964889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:36.964949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:37.000981  420062 cri.go:89] found id: ""
	I1217 20:33:37.000999  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.001008  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:37.001014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:37.001098  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:37.036987  420062 cri.go:89] found id: ""
	I1217 20:33:37.037001  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.037008  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:37.037013  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:37.037083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:37.067078  420062 cri.go:89] found id: ""
	I1217 20:33:37.067092  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.067099  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:37.067105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:37.067173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:37.101494  420062 cri.go:89] found id: ""
	I1217 20:33:37.101509  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.101516  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:37.101522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:37.101582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:37.125577  420062 cri.go:89] found id: ""
	I1217 20:33:37.125591  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.125599  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:37.125604  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:37.125672  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:37.155006  420062 cri.go:89] found id: ""
	I1217 20:33:37.155022  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.155040  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:37.155045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:37.155105  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:37.180061  420062 cri.go:89] found id: ""
	I1217 20:33:37.180075  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.180082  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:37.180090  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:37.180110  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:37.235716  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:37.235744  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:37.250676  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:37.250704  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:37.314789  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:37.314799  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:37.314811  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:37.376546  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:37.376566  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
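
Every kubectl attempt dies the same way: `dial tcp [::1]:8441: connect: connection refused`. Refused (rather than timed out) means nothing is listening on the apiserver port at all, which is consistent with crictl reporting zero control-plane containers. One way to confirm this from inside the node, reusing the kubectl binary and kubeconfig paths shown in the log (`get --raw /readyz` assumes a reasonably recent apiserver; in this run it is simply refused as well):

	sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl \
	  --kubeconfig=/var/lib/minikube/kubeconfig get --raw /readyz
	# "connection refused"   -> no listener on 8441 (this failure)
	# HTTP error or timeout  -> apiserver up but unhealthy, a different failure mode
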
	I1217 20:33:39.904036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:39.914146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:39.914209  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:39.942353  420062 cri.go:89] found id: ""
	I1217 20:33:39.942366  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.942374  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:39.942379  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:39.942445  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:39.970090  420062 cri.go:89] found id: ""
	I1217 20:33:39.970105  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.970113  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:39.970119  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:39.970185  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:40.013204  420062 cri.go:89] found id: ""
	I1217 20:33:40.013220  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.013228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:40.013234  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:40.013312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:40.055438  420062 cri.go:89] found id: ""
	I1217 20:33:40.055453  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.055461  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:40.055467  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:40.055532  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:40.088240  420062 cri.go:89] found id: ""
	I1217 20:33:40.088285  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.088293  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:40.088298  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:40.088361  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:40.116666  420062 cri.go:89] found id: ""
	I1217 20:33:40.116680  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.116687  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:40.116693  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:40.116752  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:40.143935  420062 cri.go:89] found id: ""
	I1217 20:33:40.143951  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.143965  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:40.143973  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:40.143986  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:40.199464  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:40.199484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:40.214665  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:40.214682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:40.285603  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:40.285613  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:40.285623  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:40.348551  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:40.348571  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:42.882366  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:42.892346  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:42.892407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:42.917526  420062 cri.go:89] found id: ""
	I1217 20:33:42.917540  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.917548  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:42.917553  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:42.917622  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:42.941649  420062 cri.go:89] found id: ""
	I1217 20:33:42.941663  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.941670  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:42.941675  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:42.941737  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:42.965314  420062 cri.go:89] found id: ""
	I1217 20:33:42.965328  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.965335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:42.965341  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:42.965399  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:42.992861  420062 cri.go:89] found id: ""
	I1217 20:33:42.992875  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.992882  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:42.992888  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:42.992949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:43.026962  420062 cri.go:89] found id: ""
	I1217 20:33:43.026977  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.026984  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:43.026989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:43.027048  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:43.056268  420062 cri.go:89] found id: ""
	I1217 20:33:43.056282  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.056289  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:43.056295  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:43.056353  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:43.088527  420062 cri.go:89] found id: ""
	I1217 20:33:43.088542  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.088549  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:43.088556  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:43.088567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:43.115028  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:43.115044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:43.170239  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:43.170258  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:43.185453  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:43.185468  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:43.255155  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:43.255166  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:43.255176  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
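
The crictl lookups run against containerd's runc root `/run/containerd/runc/k8s.io`, so the empty results mean containerd genuinely has no Kubernetes containers rather than a filter mismatch. To rule out containerd itself, a hedged sketch (the socket path is containerd's stock default and an assumption about this node image):

	sudo systemctl status containerd --no-pager
	sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock ps -a
	sudo journalctl -u containerd -n 400 --no-pager | grep -iE 'error|fail'   # illustrative filter
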
	I1217 20:33:45.818750  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:45.829020  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:45.829084  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:45.854296  420062 cri.go:89] found id: ""
	I1217 20:33:45.854310  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.854319  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:45.854327  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:45.854393  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:45.884706  420062 cri.go:89] found id: ""
	I1217 20:33:45.884720  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.884728  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:45.884733  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:45.884795  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:45.909518  420062 cri.go:89] found id: ""
	I1217 20:33:45.909533  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.909540  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:45.909545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:45.909615  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:45.935050  420062 cri.go:89] found id: ""
	I1217 20:33:45.935065  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.935073  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:45.935078  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:45.935155  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:45.964622  420062 cri.go:89] found id: ""
	I1217 20:33:45.964636  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.964643  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:45.964648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:45.964714  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:45.992340  420062 cri.go:89] found id: ""
	I1217 20:33:45.992355  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.992363  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:45.992368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:45.992432  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:46.029800  420062 cri.go:89] found id: ""
	I1217 20:33:46.029815  420062 logs.go:282] 0 containers: []
	W1217 20:33:46.029822  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:46.029841  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:46.029852  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:46.096203  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:46.096224  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:46.111499  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:46.111517  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:46.174259  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:46.174269  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:46.174282  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:46.239891  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:46.239911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:48.769726  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:48.779731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:48.779796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:48.803697  420062 cri.go:89] found id: ""
	I1217 20:33:48.803710  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.803718  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:48.803723  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:48.803790  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:48.828947  420062 cri.go:89] found id: ""
	I1217 20:33:48.828966  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.828974  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:48.828979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:48.829045  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:48.853794  420062 cri.go:89] found id: ""
	I1217 20:33:48.853809  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.853815  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:48.853821  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:48.853884  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:48.879220  420062 cri.go:89] found id: ""
	I1217 20:33:48.879234  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.879241  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:48.879253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:48.879316  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:48.905546  420062 cri.go:89] found id: ""
	I1217 20:33:48.905560  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.905567  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:48.905573  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:48.905639  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:48.931025  420062 cri.go:89] found id: ""
	I1217 20:33:48.931040  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.931047  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:48.931053  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:48.931111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:48.959554  420062 cri.go:89] found id: ""
	I1217 20:33:48.959567  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.959575  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:48.959591  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:48.959603  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:49.037548  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:49.037558  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:49.037576  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:49.104606  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:49.104628  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:49.132120  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:49.132142  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:49.189781  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:49.189799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:51.705313  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:51.715310  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:51.715375  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:51.742788  420062 cri.go:89] found id: ""
	I1217 20:33:51.742803  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.742810  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:51.742816  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:51.742878  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:51.768132  420062 cri.go:89] found id: ""
	I1217 20:33:51.768147  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.768154  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:51.768160  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:51.768220  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:51.796803  420062 cri.go:89] found id: ""
	I1217 20:33:51.796817  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.796825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:51.796831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:51.796891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:51.823032  420062 cri.go:89] found id: ""
	I1217 20:33:51.823046  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.823054  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:51.823061  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:51.823122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:51.848750  420062 cri.go:89] found id: ""
	I1217 20:33:51.848765  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.848773  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:51.848778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:51.848840  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:51.874494  420062 cri.go:89] found id: ""
	I1217 20:33:51.874509  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.874516  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:51.874522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:51.874582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:51.912240  420062 cri.go:89] found id: ""
	I1217 20:33:51.912273  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.912281  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:51.912290  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:51.912301  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:51.940881  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:51.940897  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:51.997574  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:51.997596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:52.016000  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:52.016018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:52.093264  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:52.093274  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:52.093286  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:54.657449  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:54.667679  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:54.667741  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:54.696106  420062 cri.go:89] found id: ""
	I1217 20:33:54.696121  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.696128  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:54.696133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:54.696194  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:54.720578  420062 cri.go:89] found id: ""
	I1217 20:33:54.720592  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.720599  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:54.720605  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:54.720669  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:54.746036  420062 cri.go:89] found id: ""
	I1217 20:33:54.746050  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.746058  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:54.746063  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:54.746122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:54.770192  420062 cri.go:89] found id: ""
	I1217 20:33:54.770206  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.770213  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:54.770219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:54.770275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:54.794365  420062 cri.go:89] found id: ""
	I1217 20:33:54.794379  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.794386  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:54.794391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:54.794454  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:54.818424  420062 cri.go:89] found id: ""
	I1217 20:33:54.818438  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.818446  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:54.818451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:54.818513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:54.843360  420062 cri.go:89] found id: ""
	I1217 20:33:54.843375  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.843382  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:54.843401  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:54.843412  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:54.872684  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:54.872701  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:54.928831  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:54.928851  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:54.943545  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:54.943561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:55.020697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:55.020721  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:55.020734  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.590507  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:57.600840  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:57.600911  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:57.628650  420062 cri.go:89] found id: ""
	I1217 20:33:57.628664  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.628671  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:57.628676  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:57.628736  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:57.653915  420062 cri.go:89] found id: ""
	I1217 20:33:57.653929  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.653936  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:57.653941  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:57.654005  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:57.677881  420062 cri.go:89] found id: ""
	I1217 20:33:57.677894  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.677901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:57.677906  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:57.677974  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:57.701808  420062 cri.go:89] found id: ""
	I1217 20:33:57.701823  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.701830  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:57.701836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:57.701894  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:57.725682  420062 cri.go:89] found id: ""
	I1217 20:33:57.725696  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.725703  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:57.725708  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:57.725770  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:57.753864  420062 cri.go:89] found id: ""
	I1217 20:33:57.753878  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.753885  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:57.753891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:57.753948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:57.779180  420062 cri.go:89] found id: ""
	I1217 20:33:57.779193  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.779200  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:57.779216  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:57.779227  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:57.834554  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:57.834575  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:57.849468  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:57.849484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:57.917796  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:57.917816  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:57.917827  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.980535  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:57.980556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
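
Note that the dmesg pass in each iteration is already tightly filtered: `-P` disables the pager, `-H` selects human-readable output, `-L=never` strips color codes, and `--level warn,err,crit,alert,emerg` keeps only warnings and worse, so a quiet dmesg section points the blame at userspace rather than the kernel. When reproducing interactively, it is often easier to follow the kubelet unit live while the wait loop spins (a sketch; `-f` is journalctl's standard follow flag):

	sudo journalctl -u kubelet -f --no-pager
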
	I1217 20:34:00.519246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:00.531028  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:00.531090  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:00.557919  420062 cri.go:89] found id: ""
	I1217 20:34:00.557933  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.557941  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:00.557947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:00.558006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:00.583357  420062 cri.go:89] found id: ""
	I1217 20:34:00.583381  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.583389  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:00.583394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:00.583461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:00.608300  420062 cri.go:89] found id: ""
	I1217 20:34:00.608313  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.608321  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:00.608326  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:00.608396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:00.633249  420062 cri.go:89] found id: ""
	I1217 20:34:00.633263  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.633271  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:00.633277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:00.633354  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:00.657998  420062 cri.go:89] found id: ""
	I1217 20:34:00.658012  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.658020  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:00.658025  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:00.658083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:00.686479  420062 cri.go:89] found id: ""
	I1217 20:34:00.686494  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.686502  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:00.686517  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:00.686600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:00.715237  420062 cri.go:89] found id: ""
	I1217 20:34:00.715251  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.715259  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:00.715281  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:00.715297  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:00.771736  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:00.771756  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:00.786569  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:00.786584  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:00.855532  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:00.846820   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.847617   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849290   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849821   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.851435   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:00.855544  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:00.855556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:00.929889  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:00.929917  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.457778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:03.467767  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:03.467830  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:03.491745  420062 cri.go:89] found id: ""
	I1217 20:34:03.491760  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.491767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:03.491772  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:03.491834  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:03.516486  420062 cri.go:89] found id: ""
	I1217 20:34:03.516501  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.516508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:03.516514  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:03.516573  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:03.545504  420062 cri.go:89] found id: ""
	I1217 20:34:03.545518  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.545526  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:03.545531  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:03.545592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:03.570752  420062 cri.go:89] found id: ""
	I1217 20:34:03.570766  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.570773  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:03.570779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:03.570837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:03.599464  420062 cri.go:89] found id: ""
	I1217 20:34:03.599478  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.599486  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:03.599491  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:03.599551  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:03.626193  420062 cri.go:89] found id: ""
	I1217 20:34:03.626209  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.626217  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:03.626222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:03.626280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:03.650682  420062 cri.go:89] found id: ""
	I1217 20:34:03.650696  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.650704  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:03.650712  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:03.650724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:03.712614  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:03.705244   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.705869   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.706805   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.707331   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.708827   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:03.712625  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:03.712636  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:03.775226  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:03.775247  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.801581  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:03.801600  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:03.857991  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:03.858013  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.373018  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:06.382912  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:06.382972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:06.408596  420062 cri.go:89] found id: ""
	I1217 20:34:06.408610  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.408617  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:06.408622  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:06.408681  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:06.437062  420062 cri.go:89] found id: ""
	I1217 20:34:06.437076  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.437083  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:06.437088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:06.437149  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:06.463109  420062 cri.go:89] found id: ""
	I1217 20:34:06.463123  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.463130  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:06.463135  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:06.463198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:06.487450  420062 cri.go:89] found id: ""
	I1217 20:34:06.487463  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.487470  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:06.487476  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:06.487537  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:06.512848  420062 cri.go:89] found id: ""
	I1217 20:34:06.512863  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.512870  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:06.512876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:06.512939  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:06.536984  420062 cri.go:89] found id: ""
	I1217 20:34:06.536998  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.537006  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:06.537011  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:06.537069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:06.565689  420062 cri.go:89] found id: ""
	I1217 20:34:06.565732  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.565740  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:06.565748  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:06.565758  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:06.626274  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:06.626294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.641612  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:06.641630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:06.703082  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:06.694717   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.695365   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697091   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697739   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.699357   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:06.703092  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:06.703104  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:06.768202  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:06.768221  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:09.296397  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:09.306558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:09.306619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:09.330814  420062 cri.go:89] found id: ""
	I1217 20:34:09.330828  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.330836  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:09.330841  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:09.330900  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:09.360228  420062 cri.go:89] found id: ""
	I1217 20:34:09.360242  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.360270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:09.360276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:09.360336  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:09.383852  420062 cri.go:89] found id: ""
	I1217 20:34:09.383865  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.383871  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:09.383876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:09.383933  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:09.408740  420062 cri.go:89] found id: ""
	I1217 20:34:09.408753  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.408760  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:09.408765  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:09.408824  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:09.433879  420062 cri.go:89] found id: ""
	I1217 20:34:09.433894  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.433901  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:09.433907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:09.433965  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:09.458138  420062 cri.go:89] found id: ""
	I1217 20:34:09.458152  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.458160  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:09.458165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:09.458223  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:09.482170  420062 cri.go:89] found id: ""
	I1217 20:34:09.482184  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.482191  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:09.482199  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:09.482214  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:09.539809  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:09.539831  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:09.555108  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:09.555124  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:09.617755  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:09.617779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:09.617790  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:09.680900  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:09.680920  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:12.217262  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:12.227378  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:12.227441  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:12.260904  420062 cri.go:89] found id: ""
	I1217 20:34:12.260918  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.260926  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:12.260931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:12.260991  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:12.290600  420062 cri.go:89] found id: ""
	I1217 20:34:12.290614  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.290621  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:12.290626  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:12.290694  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:12.317694  420062 cri.go:89] found id: ""
	I1217 20:34:12.317708  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.317716  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:12.317721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:12.317789  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:12.347280  420062 cri.go:89] found id: ""
	I1217 20:34:12.347300  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.347308  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:12.347323  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:12.347382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:12.375032  420062 cri.go:89] found id: ""
	I1217 20:34:12.375046  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.375054  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:12.375060  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:12.375121  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:12.400749  420062 cri.go:89] found id: ""
	I1217 20:34:12.400763  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.400771  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:12.400779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:12.400837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:12.425915  420062 cri.go:89] found id: ""
	I1217 20:34:12.425929  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.425937  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:12.425946  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:12.425957  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:12.486250  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:12.486269  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:12.501500  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:12.501515  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:12.571896  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:12.563218   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564136   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564679   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566300   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566801   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:12.571906  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:12.571921  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:12.635853  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:12.635876  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.166604  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:15.177581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:15.177645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:15.201800  420062 cri.go:89] found id: ""
	I1217 20:34:15.201815  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.201822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:15.201828  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:15.201892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:15.229609  420062 cri.go:89] found id: ""
	I1217 20:34:15.229624  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.229631  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:15.229636  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:15.229703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:15.257583  420062 cri.go:89] found id: ""
	I1217 20:34:15.257597  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.257605  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:15.257610  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:15.257673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:15.291085  420062 cri.go:89] found id: ""
	I1217 20:34:15.291099  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.291106  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:15.291112  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:15.291190  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:15.324198  420062 cri.go:89] found id: ""
	I1217 20:34:15.324212  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.324219  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:15.324226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:15.324317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:15.348977  420062 cri.go:89] found id: ""
	I1217 20:34:15.348991  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.348998  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:15.349004  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:15.349069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:15.373132  420062 cri.go:89] found id: ""
	I1217 20:34:15.373147  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.373155  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:15.373162  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:15.373174  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:15.387711  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:15.387728  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:15.453164  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:15.443181   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.443915   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.445657   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.447470   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.448047   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:15.453175  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:15.453187  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:15.519197  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:15.519219  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.547781  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:15.547799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.106475  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:18.117557  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:18.117619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:18.142233  420062 cri.go:89] found id: ""
	I1217 20:34:18.142246  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.142253  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:18.142258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:18.142319  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:18.166913  420062 cri.go:89] found id: ""
	I1217 20:34:18.166927  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.166934  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:18.166940  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:18.167002  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:18.195856  420062 cri.go:89] found id: ""
	I1217 20:34:18.195870  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.195877  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:18.195883  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:18.195944  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:18.222291  420062 cri.go:89] found id: ""
	I1217 20:34:18.222306  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.222313  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:18.222318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:18.222382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:18.254911  420062 cri.go:89] found id: ""
	I1217 20:34:18.254925  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.254932  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:18.254937  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:18.254996  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:18.299082  420062 cri.go:89] found id: ""
	I1217 20:34:18.299096  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.299103  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:18.299109  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:18.299173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:18.323848  420062 cri.go:89] found id: ""
	I1217 20:34:18.323862  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.323869  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:18.323877  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:18.323888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.381056  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:18.381082  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:18.395602  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:18.395617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:18.459223  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:18.450909   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.451543   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453107   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453711   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.455276   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:18.459233  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:18.459244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:18.522287  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:18.522307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:21.051832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:21.062206  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:21.062275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:21.090124  420062 cri.go:89] found id: ""
	I1217 20:34:21.090139  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.090146  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:21.090151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:21.090211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:21.114268  420062 cri.go:89] found id: ""
	I1217 20:34:21.114282  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.114289  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:21.114294  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:21.114357  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:21.141585  420062 cri.go:89] found id: ""
	I1217 20:34:21.141599  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.141606  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:21.141611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:21.141673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:21.167173  420062 cri.go:89] found id: ""
	I1217 20:34:21.167187  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.167195  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:21.167200  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:21.167277  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:21.191543  420062 cri.go:89] found id: ""
	I1217 20:34:21.191557  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.191564  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:21.191569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:21.191640  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:21.219365  420062 cri.go:89] found id: ""
	I1217 20:34:21.219378  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.219385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:21.219390  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:21.219451  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:21.256303  420062 cri.go:89] found id: ""
	I1217 20:34:21.256317  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.256324  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:21.256332  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:21.256342  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:21.323014  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:21.323035  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:21.337647  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:21.337664  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:21.400131  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:21.391524   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.392409   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394006   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394305   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.395921   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:21.400140  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:21.400151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:21.467704  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:21.467725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:23.996278  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:24.008421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:24.008487  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:24.035322  420062 cri.go:89] found id: ""
	I1217 20:34:24.035336  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.035344  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:24.035349  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:24.035413  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:24.060026  420062 cri.go:89] found id: ""
	I1217 20:34:24.060040  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.060048  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:24.060054  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:24.060131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:24.085236  420062 cri.go:89] found id: ""
	I1217 20:34:24.085250  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.085257  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:24.085263  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:24.085323  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:24.110730  420062 cri.go:89] found id: ""
	I1217 20:34:24.110763  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.110772  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:24.110778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:24.110851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:24.138006  420062 cri.go:89] found id: ""
	I1217 20:34:24.138020  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.138028  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:24.138034  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:24.138094  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:24.168065  420062 cri.go:89] found id: ""
	I1217 20:34:24.168080  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.168094  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:24.168100  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:24.168172  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:24.193244  420062 cri.go:89] found id: ""
	I1217 20:34:24.193258  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.193265  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:24.193273  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:24.193284  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:24.260181  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:24.260201  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:24.299429  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:24.299446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:24.355633  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:24.355653  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:24.371493  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:24.371508  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:24.439767  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
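
The cycle above repeats on a roughly three-second interval: minikube probes for each expected control-plane container, finds none, then re-gathers containerd, kubelet, and dmesg logs before retrying. The component sweep can be reproduced by hand; the sketch below is an assumption-laden aid, not part of the test run, and presumes shell access to the node for this profile (for example via "minikube ssh -p functional-682596", the profile name taken from the test invocation) with crictl available, as the logged commands imply:

	# Sketch: repeat the per-component container probe seen in the log above.
	for c in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
	  echo "== ${c} =="
	  # An empty result matches the 'found id: ""' lines in the log.
	  sudo crictl ps -a --quiet --name="${c}"
	done
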
	I1217 20:34:26.940651  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:26.951081  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:26.951148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:26.975583  420062 cri.go:89] found id: ""
	I1217 20:34:26.975598  420062 logs.go:282] 0 containers: []
	W1217 20:34:26.975606  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:26.975611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:26.975671  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:27.003924  420062 cri.go:89] found id: ""
	I1217 20:34:27.003939  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.003948  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:27.003954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:27.004018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:27.029433  420062 cri.go:89] found id: ""
	I1217 20:34:27.029446  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.029454  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:27.029460  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:27.029520  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:27.055977  420062 cri.go:89] found id: ""
	I1217 20:34:27.055990  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.055998  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:27.056027  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:27.056093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:27.081756  420062 cri.go:89] found id: ""
	I1217 20:34:27.081770  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.081777  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:27.081783  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:27.081846  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:27.106532  420062 cri.go:89] found id: ""
	I1217 20:34:27.106546  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.106554  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:27.106587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:27.106651  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:27.131573  420062 cri.go:89] found id: ""
	I1217 20:34:27.131587  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.131595  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:27.131603  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:27.131613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:27.194270  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:27.194290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:27.222438  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:27.222453  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:27.284134  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:27.284154  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:27.300336  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:27.300352  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:27.369337  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:29.871004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:29.881325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:29.881389  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:29.906739  420062 cri.go:89] found id: ""
	I1217 20:34:29.906753  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.906760  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:29.906766  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:29.906828  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:29.935023  420062 cri.go:89] found id: ""
	I1217 20:34:29.935037  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.935045  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:29.935049  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:29.935110  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:29.968427  420062 cri.go:89] found id: ""
	I1217 20:34:29.968442  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.968449  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:29.968454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:29.968514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:29.993120  420062 cri.go:89] found id: ""
	I1217 20:34:29.993133  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.993141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:29.993147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:29.993208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:30.038216  420062 cri.go:89] found id: ""
	I1217 20:34:30.038232  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.038240  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:30.038256  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:30.038331  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:30.088044  420062 cri.go:89] found id: ""
	I1217 20:34:30.088059  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.088067  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:30.088080  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:30.088145  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:30.116773  420062 cri.go:89] found id: ""
	I1217 20:34:30.116789  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.116798  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:30.116808  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:30.116819  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:30.175618  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:30.175638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:30.191950  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:30.191967  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:30.268938  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:30.268949  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:30.268960  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:30.345609  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:30.345631  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:32.873852  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:32.884009  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:32.884072  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:32.908673  420062 cri.go:89] found id: ""
	I1217 20:34:32.908688  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.908696  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:32.908701  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:32.908761  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:32.933101  420062 cri.go:89] found id: ""
	I1217 20:34:32.933115  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.933122  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:32.933127  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:32.933192  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:32.956968  420062 cri.go:89] found id: ""
	I1217 20:34:32.956982  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.956991  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:32.956996  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:32.957054  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:32.982228  420062 cri.go:89] found id: ""
	I1217 20:34:32.982241  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.982249  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:32.982254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:32.982312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:33.011791  420062 cri.go:89] found id: ""
	I1217 20:34:33.011805  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.011812  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:33.011818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:33.011885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:33.038878  420062 cri.go:89] found id: ""
	I1217 20:34:33.038894  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.038901  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:33.038907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:33.038969  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:33.068421  420062 cri.go:89] found id: ""
	I1217 20:34:33.068436  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.068443  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:33.068453  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:33.068463  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:33.083444  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:33.083461  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:33.147593  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:33.147604  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:33.147617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:33.211005  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:33.211025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:33.247311  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:33.247327  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:35.820692  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:35.830805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:35.830879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:35.855694  420062 cri.go:89] found id: ""
	I1217 20:34:35.855708  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.855716  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:35.855721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:35.855780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:35.879932  420062 cri.go:89] found id: ""
	I1217 20:34:35.879947  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.879955  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:35.879960  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:35.880021  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:35.904606  420062 cri.go:89] found id: ""
	I1217 20:34:35.904622  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.904630  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:35.904635  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:35.904700  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:35.932655  420062 cri.go:89] found id: ""
	I1217 20:34:35.932669  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.932676  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:35.932681  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:35.932742  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:35.956665  420062 cri.go:89] found id: ""
	I1217 20:34:35.956679  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.956686  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:35.956691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:35.956748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:35.981363  420062 cri.go:89] found id: ""
	I1217 20:34:35.981377  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.981385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:35.981391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:35.981450  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:36.013052  420062 cri.go:89] found id: ""
	I1217 20:34:36.013068  420062 logs.go:282] 0 containers: []
	W1217 20:34:36.013076  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:36.013084  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:36.013097  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:36.080346  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:36.080367  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:36.109280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:36.109296  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:36.168612  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:36.168630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:36.183490  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:36.183505  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:36.254206  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
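
Every kubectl attempt above fails identically with "dial tcp [::1]:8441: connect: connection refused", which means nothing is accepting connections on the apiserver port inside the node; that is consistent with the empty kube-apiserver container listings. A quick check, again assuming the same node shell access as the sketch earlier and that the ss utility from iproute2 is present, is to look for a listener on 8441:

	# No output from ss means no process is bound to the apiserver port,
	# which is exactly what the repeated 'connection refused' errors indicate.
	sudo ss -ltnp | grep -w 8441 || echo "no listener on 8441"
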
	I1217 20:34:38.754461  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:38.764820  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:38.764885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:38.790226  420062 cri.go:89] found id: ""
	I1217 20:34:38.790243  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.790251  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:38.790257  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:38.790317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:38.815898  420062 cri.go:89] found id: ""
	I1217 20:34:38.815913  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.815920  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:38.815925  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:38.815986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:38.840879  420062 cri.go:89] found id: ""
	I1217 20:34:38.840894  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.840901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:38.840907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:38.840967  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:38.865756  420062 cri.go:89] found id: ""
	I1217 20:34:38.865772  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.865780  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:38.865785  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:38.865851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:38.893497  420062 cri.go:89] found id: ""
	I1217 20:34:38.893511  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.893518  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:38.893523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:38.893582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:38.918737  420062 cri.go:89] found id: ""
	I1217 20:34:38.918751  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.918758  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:38.918763  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:38.918821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:38.943126  420062 cri.go:89] found id: ""
	I1217 20:34:38.943140  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.943147  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:38.943155  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:38.943166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:39.008933  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:39.008944  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:39.008955  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:39.071529  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:39.071550  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:39.098851  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:39.098866  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:39.157559  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:39.157578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.673292  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:41.683569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:41.683631  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:41.712444  420062 cri.go:89] found id: ""
	I1217 20:34:41.712458  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.712466  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:41.712471  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:41.712540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:41.737230  420062 cri.go:89] found id: ""
	I1217 20:34:41.737244  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.737253  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:41.737258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:41.737320  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:41.765904  420062 cri.go:89] found id: ""
	I1217 20:34:41.765918  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.765926  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:41.765931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:41.765993  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:41.790803  420062 cri.go:89] found id: ""
	I1217 20:34:41.790818  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.790826  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:41.790831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:41.790891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:41.816378  420062 cri.go:89] found id: ""
	I1217 20:34:41.816393  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.816399  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:41.816405  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:41.816465  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:41.846163  420062 cri.go:89] found id: ""
	I1217 20:34:41.846177  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.846184  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:41.846190  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:41.846249  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:41.874235  420062 cri.go:89] found id: ""
	I1217 20:34:41.874249  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.874257  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:41.874264  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:41.874278  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:41.930007  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:41.930025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.944733  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:41.944748  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:42.015145  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:42.015157  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:42.015168  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:42.083018  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:42.083046  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.617783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:44.627898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:44.627959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:44.654510  420062 cri.go:89] found id: ""
	I1217 20:34:44.654524  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.654531  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:44.654536  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:44.654600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:44.681532  420062 cri.go:89] found id: ""
	I1217 20:34:44.681547  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.681554  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:44.681560  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:44.681620  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:44.705927  420062 cri.go:89] found id: ""
	I1217 20:34:44.705941  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.705948  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:44.705953  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:44.706010  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:44.730835  420062 cri.go:89] found id: ""
	I1217 20:34:44.730849  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.730857  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:44.730862  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:44.730925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:44.754987  420062 cri.go:89] found id: ""
	I1217 20:34:44.755002  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.755009  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:44.755014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:44.755074  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:44.778787  420062 cri.go:89] found id: ""
	I1217 20:34:44.778801  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.778808  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:44.778814  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:44.778874  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:44.804370  420062 cri.go:89] found id: ""
	I1217 20:34:44.804385  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.804392  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:44.804401  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:44.804411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:44.870852  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:44.870872  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.901529  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:44.901545  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:44.961405  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:44.961428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:44.976411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:44.976427  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:45.055180  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
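
Because no kube-apiserver container is ever created, the failure sits below kubectl: the kubelet, whose journal minikube gathers each cycle, is the component responsible for starting the apiserver static pod. The kubelet sections of this report are therefore the most likely place the root cause surfaces; one hedged way to narrow it down from the node shell (the grep pattern is illustrative, not from the test run) is:

	# Surface recent kubelet complaints mentioning the apiserver pod;
	# adjust the pattern and line counts as needed.
	sudo journalctl -u kubelet -n 400 --no-pager | grep -iE 'kube-apiserver|error|fail' | tail -n 40
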
	I1217 20:34:47.555437  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:47.565320  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:47.565380  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:47.594473  420062 cri.go:89] found id: ""
	I1217 20:34:47.594488  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.594495  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:47.594500  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:47.594560  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:47.618819  420062 cri.go:89] found id: ""
	I1217 20:34:47.618833  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.618840  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:47.618845  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:47.618906  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:47.643299  420062 cri.go:89] found id: ""
	I1217 20:34:47.643313  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.643320  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:47.643325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:47.643386  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:47.668500  420062 cri.go:89] found id: ""
	I1217 20:34:47.668514  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.668522  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:47.668527  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:47.668588  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:47.694650  420062 cri.go:89] found id: ""
	I1217 20:34:47.694671  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.694678  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:47.694683  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:47.694745  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:47.729169  420062 cri.go:89] found id: ""
	I1217 20:34:47.729183  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.729192  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:47.729197  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:47.729258  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:47.753481  420062 cri.go:89] found id: ""
	I1217 20:34:47.753494  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.753501  420062 logs.go:284] No container was found matching "kindnet"
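The sweep above asks the CRI for each control-plane component by name and finds nothing, not even exited containers (crictl ps -a includes stopped ones), so the control plane was never created rather than crashing after startup. A sketch of the same per-component query driven from Go (illustrative only; it shells out to crictl with the exact flags shown in the log and assumes sudo and crictl are available, as they are on the node):

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// containerIDs mirrors the query repeated in the log:
//   sudo crictl ps -a --quiet --name=<component>
// --quiet prints one container ID per line; -a includes exited containers.
func containerIDs(component string) []string {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+component).Output()
	if err != nil {
		return nil
	}
	return strings.Fields(string(out))
}

func main() {
	for _, c := range []string{"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet"} {
		fmt.Printf("%-24s %d container(s)\n", c, len(containerIDs(c)))
	}
}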
	I1217 20:34:47.753509  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:47.753521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:47.768175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:47.768192  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:47.832224  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.832234  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:47.832264  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:47.894275  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:47.894294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:47.921621  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:47.921638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
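From here the same block repeats on a roughly three-second cadence (20:34:47, :50, :53, :56, :59, 20:35:02, ...): pgrep for the apiserver process, the seven crictl queries, then kubelet, dmesg, describe-nodes, containerd, and container-status log gathering on every miss. The shape of that wait loop, compressed into a sketch (illustrative, not minikube's actual code; the interval is read off the log timestamps):

package main

import (
	"errors"
	"fmt"
	"os/exec"
	"time"
)

// apiserverUp runs the same process check the log shows over SSH:
//   sudo pgrep -xnf kube-apiserver.*minikube.*
// pgrep exits non-zero when nothing matches, which Run() reports as an error.
func apiserverUp() bool {
	return exec.Command("pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil
}

func waitForAPIServer(timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		if apiserverUp() {
			return nil
		}
		// on each miss the real code also dumps diagnostics, as seen above
		time.Sleep(3 * time.Second)
	}
	return errors.New("timed out waiting for kube-apiserver")
}

func main() {
	if err := waitForAPIServer(time.Minute); err != nil {
		fmt.Println(err)
	}
}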
	I1217 20:34:50.477347  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:50.487837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:50.487905  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:50.515440  420062 cri.go:89] found id: ""
	I1217 20:34:50.515460  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.515468  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:50.515473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:50.515545  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:50.542521  420062 cri.go:89] found id: ""
	I1217 20:34:50.542546  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.542553  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:50.542559  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:50.542629  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:50.569586  420062 cri.go:89] found id: ""
	I1217 20:34:50.569600  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.569613  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:50.569618  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:50.569677  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:50.597938  420062 cri.go:89] found id: ""
	I1217 20:34:50.597951  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.597958  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:50.597966  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:50.598024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:50.627019  420062 cri.go:89] found id: ""
	I1217 20:34:50.627044  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.627052  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:50.627057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:50.627128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:50.655921  420062 cri.go:89] found id: ""
	I1217 20:34:50.655948  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.655956  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:50.655962  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:50.656028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:50.680457  420062 cri.go:89] found id: ""
	I1217 20:34:50.680471  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.680479  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:50.680487  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:50.680502  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:50.742350  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:50.742360  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:50.742370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:50.802977  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:50.802997  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:50.830354  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:50.830370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.887850  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:50.887869  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.403065  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:53.413162  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:53.413227  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:53.437500  420062 cri.go:89] found id: ""
	I1217 20:34:53.437513  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.437521  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:53.437526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:53.437592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:53.462889  420062 cri.go:89] found id: ""
	I1217 20:34:53.462902  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.462910  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:53.462915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:53.462972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:53.493212  420062 cri.go:89] found id: ""
	I1217 20:34:53.493226  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.493234  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:53.493239  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:53.493301  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:53.521829  420062 cri.go:89] found id: ""
	I1217 20:34:53.521844  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.521851  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:53.521857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:53.521919  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:53.558427  420062 cri.go:89] found id: ""
	I1217 20:34:53.558442  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.558449  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:53.558454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:53.558513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:53.583439  420062 cri.go:89] found id: ""
	I1217 20:34:53.583453  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.583460  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:53.583466  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:53.583526  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:53.608693  420062 cri.go:89] found id: ""
	I1217 20:34:53.608707  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.608714  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:53.608722  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:53.608732  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:53.664959  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:53.664980  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.679865  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:53.679886  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:53.742568  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:53.742579  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:53.742591  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:53.803297  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:53.803317  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.335304  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:56.344915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:56.344977  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:56.368289  420062 cri.go:89] found id: ""
	I1217 20:34:56.368304  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.368312  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:56.368319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:56.368388  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:56.392693  420062 cri.go:89] found id: ""
	I1217 20:34:56.392707  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.392715  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:56.392721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:56.392782  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:56.419795  420062 cri.go:89] found id: ""
	I1217 20:34:56.419809  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.419825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:56.419834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:56.419902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:56.445038  420062 cri.go:89] found id: ""
	I1217 20:34:56.445052  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.445060  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:56.445065  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:56.445128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:56.474272  420062 cri.go:89] found id: ""
	I1217 20:34:56.474287  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.474294  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:56.474300  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:56.474366  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:56.507935  420062 cri.go:89] found id: ""
	I1217 20:34:56.507950  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.507957  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:56.507963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:56.508030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:56.535999  420062 cri.go:89] found id: ""
	I1217 20:34:56.536012  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.536030  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:56.536039  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:56.536050  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.572020  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:56.572037  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:56.628661  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:56.628681  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:56.643833  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:56.643856  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:56.710351  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:56.710361  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:56.710380  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.273579  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:59.283581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:59.283645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:59.309480  420062 cri.go:89] found id: ""
	I1217 20:34:59.309493  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.309500  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:59.309506  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:59.309564  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:59.333365  420062 cri.go:89] found id: ""
	I1217 20:34:59.333378  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.333386  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:59.333391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:59.333452  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:59.357207  420062 cri.go:89] found id: ""
	I1217 20:34:59.357221  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.357228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:59.357233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:59.357298  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:59.381758  420062 cri.go:89] found id: ""
	I1217 20:34:59.381772  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.381781  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:59.381787  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:59.381845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:59.406750  420062 cri.go:89] found id: ""
	I1217 20:34:59.406764  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.406772  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:59.406777  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:59.406845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:59.431825  420062 cri.go:89] found id: ""
	I1217 20:34:59.431838  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.431846  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:59.431852  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:59.431913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:59.458993  420062 cri.go:89] found id: ""
	I1217 20:34:59.459007  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.459014  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:59.459022  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:59.459041  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:59.546381  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:59.546391  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:59.546401  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.613987  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:59.614007  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:59.644296  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:59.644311  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:59.703226  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:59.703245  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.218783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:02.229042  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:02.229114  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:02.254286  420062 cri.go:89] found id: ""
	I1217 20:35:02.254300  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.254308  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:02.254315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:02.254374  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:02.281092  420062 cri.go:89] found id: ""
	I1217 20:35:02.281106  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.281114  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:02.281120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:02.281198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:02.310195  420062 cri.go:89] found id: ""
	I1217 20:35:02.310209  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.310217  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:02.310222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:02.310294  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:02.338807  420062 cri.go:89] found id: ""
	I1217 20:35:02.338821  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.338829  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:02.338834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:02.338904  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:02.364604  420062 cri.go:89] found id: ""
	I1217 20:35:02.364618  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.364625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:02.364631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:02.364693  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:02.389458  420062 cri.go:89] found id: ""
	I1217 20:35:02.389473  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.389481  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:02.389486  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:02.389544  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:02.419120  420062 cri.go:89] found id: ""
	I1217 20:35:02.419134  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.419142  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:02.419151  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:02.419162  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:02.476620  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:02.476640  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.492411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:02.492428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:02.567285  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:02.567294  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:02.567308  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:02.635002  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:02.635022  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.163567  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:05.174184  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:05.174245  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:05.199116  420062 cri.go:89] found id: ""
	I1217 20:35:05.199130  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.199137  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:05.199143  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:05.199206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:05.223477  420062 cri.go:89] found id: ""
	I1217 20:35:05.223491  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.223498  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:05.223504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:05.223562  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:05.247303  420062 cri.go:89] found id: ""
	I1217 20:35:05.247317  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.247325  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:05.247332  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:05.247391  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:05.272620  420062 cri.go:89] found id: ""
	I1217 20:35:05.272633  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.272641  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:05.272646  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:05.272703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:05.300419  420062 cri.go:89] found id: ""
	I1217 20:35:05.300434  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.300441  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:05.300446  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:05.300505  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:05.325851  420062 cri.go:89] found id: ""
	I1217 20:35:05.325866  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.325873  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:05.325879  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:05.325938  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:05.354430  420062 cri.go:89] found id: ""
	I1217 20:35:05.354445  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.354452  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:05.354460  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:05.354475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:05.369668  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:05.369686  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:05.436390  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:05.436400  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:05.436411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:05.499177  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:05.499202  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.531231  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:05.531248  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.088375  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:08.098640  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:08.098711  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:08.132112  420062 cri.go:89] found id: ""
	I1217 20:35:08.132127  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:08.132141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:08.132205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:08.157778  420062 cri.go:89] found id: ""
	I1217 20:35:08.157792  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.157800  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:08.157805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:08.157862  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:08.183372  420062 cri.go:89] found id: ""
	I1217 20:35:08.183386  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.183393  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:08.183399  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:08.183457  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:08.208186  420062 cri.go:89] found id: ""
	I1217 20:35:08.208200  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.208207  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:08.208212  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:08.208310  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:08.236181  420062 cri.go:89] found id: ""
	I1217 20:35:08.236195  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.236202  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:08.236207  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:08.236313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:08.261508  420062 cri.go:89] found id: ""
	I1217 20:35:08.261522  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.261529  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:08.261534  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:08.261593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:08.286303  420062 cri.go:89] found id: ""
	I1217 20:35:08.286318  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.286325  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:08.286333  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:08.286349  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.345547  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:08.345573  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:08.360551  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:08.360568  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:08.424581  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:08.424593  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:08.424606  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:08.489146  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:08.489166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.022570  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:11.034138  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:11.034205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:11.066795  420062 cri.go:89] found id: ""
	I1217 20:35:11.066810  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.066817  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:11.066825  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:11.066888  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:11.092902  420062 cri.go:89] found id: ""
	I1217 20:35:11.092917  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.092925  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:11.092931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:11.092998  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:11.120040  420062 cri.go:89] found id: ""
	I1217 20:35:11.120056  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.120064  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:11.120069  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:11.120138  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:11.150096  420062 cri.go:89] found id: ""
	I1217 20:35:11.150111  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.150118  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:11.150124  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:11.150186  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:11.178952  420062 cri.go:89] found id: ""
	I1217 20:35:11.178966  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.178973  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:11.178979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:11.179042  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:11.205194  420062 cri.go:89] found id: ""
	I1217 20:35:11.205208  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.205215  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:11.205221  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:11.205281  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:11.231314  420062 cri.go:89] found id: ""
	I1217 20:35:11.231327  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.231335  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:11.231343  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:11.231355  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:11.246458  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:11.246475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:11.312684  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
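Every kubectl attempt in these iterations dies on the same dial error because nothing is listening on port 8441. A quick reachability probe, as a sketch (the 2-second timeout is an arbitrary choice), reproduces the "connect: connection refused" shown in the stderr above:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Probe the apiserver port that kubectl keeps failing against.
        conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
        if err != nil {
            // With no kube-apiserver container running, this fails
            // immediately with "connect: connection refused".
            fmt.Println("apiserver unreachable:", err)
            return
        }
        conn.Close()
        fmt.Println("apiserver port is open")
    }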
	I1217 20:35:11.312696  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:11.312706  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:11.379354  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:11.379374  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.413484  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:11.413500  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
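The "Gathering logs for kubelet ..." and "Gathering logs for containerd ..." steps are plain journalctl tails of the corresponding systemd units. A local-equivalent sketch (minikube runs the identical command through its SSH runner; the unitLogs helper is illustrative):

    package main

    import (
        "fmt"
        "os/exec"
        "strconv"
    )

    // unitLogs returns the last n lines of a systemd unit's journal,
    // matching the `journalctl -u <unit> -n 400` calls in the log.
    func unitLogs(unit string, n int) (string, error) {
        out, err := exec.Command("sudo", "journalctl",
            "-u", unit, "-n", strconv.Itoa(n)).Output()
        return string(out), err
    }

    func main() {
        for _, u := range []string{"kubelet", "containerd"} {
            logs, err := unitLogs(u, 400)
            if err != nil {
                fmt.Printf("journalctl -u %s failed: %v\n", u, err)
                continue
            }
            fmt.Printf("== %s: %d bytes of journal ==\n", u, len(logs))
        }
    }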
	I1217 20:35:13.972078  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:13.982223  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:13.982290  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:14.022488  420062 cri.go:89] found id: ""
	I1217 20:35:14.022502  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.022510  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:14.022515  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:14.022575  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:14.059328  420062 cri.go:89] found id: ""
	I1217 20:35:14.059342  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.059364  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:14.059369  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:14.059435  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:14.085531  420062 cri.go:89] found id: ""
	I1217 20:35:14.085544  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.085552  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:14.085558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:14.085616  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:14.114113  420062 cri.go:89] found id: ""
	I1217 20:35:14.114134  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.114141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:14.114147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:14.114210  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:14.138505  420062 cri.go:89] found id: ""
	I1217 20:35:14.138519  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.138526  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:14.138532  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:14.138591  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:14.162838  420062 cri.go:89] found id: ""
	I1217 20:35:14.162852  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.162858  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:14.162863  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:14.162923  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:14.190631  420062 cri.go:89] found id: ""
	I1217 20:35:14.190651  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.190665  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:14.190672  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:14.190682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:14.246544  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:14.246563  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:14.261703  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:14.261719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:14.327698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:14.327708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:14.327721  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:14.391616  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:14.391635  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:16.921553  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:16.931542  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:16.931604  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:16.955206  420062 cri.go:89] found id: ""
	I1217 20:35:16.955220  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.955227  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:16.955233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:16.955291  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:16.984598  420062 cri.go:89] found id: ""
	I1217 20:35:16.984613  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.984620  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:16.984625  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:16.984683  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:17.033712  420062 cri.go:89] found id: ""
	I1217 20:35:17.033726  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.033733  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:17.033739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:17.033796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:17.061936  420062 cri.go:89] found id: ""
	I1217 20:35:17.061950  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.061957  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:17.061963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:17.062023  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:17.086921  420062 cri.go:89] found id: ""
	I1217 20:35:17.086936  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.086943  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:17.086948  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:17.087009  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:17.112474  420062 cri.go:89] found id: ""
	I1217 20:35:17.112488  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.112495  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:17.112501  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:17.112558  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:17.137847  420062 cri.go:89] found id: ""
	I1217 20:35:17.137867  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.137875  420062 logs.go:284] No container was found matching "kindnet"
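Each iteration scans the control-plane components one at a time with `crictl ps -a --quiet --name=<component>`; an empty ID list means the runtime has never created (or has fully removed) a container with that name, which is why every scan above ends with seven "No container was found" warnings. A condensed sketch of that scan, assuming crictl is runnable locally (minikube issues it over SSH inside the node):

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    func main() {
        components := []string{
            "kube-apiserver", "etcd", "coredns", "kube-scheduler",
            "kube-proxy", "kube-controller-manager", "kindnet",
        }
        for _, name := range components {
            // --quiet prints one container ID per line; no output
            // means no container, running or exited, matches the name.
            out, err := exec.Command("sudo", "crictl", "ps", "-a",
                "--quiet", "--name="+name).Output()
            if err != nil {
                fmt.Printf("%s: crictl failed: %v\n", name, err)
                continue
            }
            ids := strings.Fields(string(out))
            fmt.Printf("%s: %d container(s)\n", name, len(ids))
        }
    }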
	I1217 20:35:17.137882  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:17.137892  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:17.198885  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:17.198904  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:17.213637  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:17.213652  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:17.281467  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:17.281478  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:17.281488  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:17.343313  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:17.343334  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:19.871984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:19.882066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:19.882128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:19.907664  420062 cri.go:89] found id: ""
	I1217 20:35:19.907678  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.907686  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:19.907691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:19.907750  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:19.936014  420062 cri.go:89] found id: ""
	I1217 20:35:19.936028  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.936035  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:19.936040  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:19.936099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:19.961865  420062 cri.go:89] found id: ""
	I1217 20:35:19.961881  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.961888  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:19.961893  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:19.961954  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:19.988749  420062 cri.go:89] found id: ""
	I1217 20:35:19.988762  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.988769  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:19.988775  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:19.988832  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:20.021844  420062 cri.go:89] found id: ""
	I1217 20:35:20.021859  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.021866  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:20.021873  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:20.021936  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:20.064328  420062 cri.go:89] found id: ""
	I1217 20:35:20.064343  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.064351  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:20.064356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:20.064464  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:20.092230  420062 cri.go:89] found id: ""
	I1217 20:35:20.092244  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.092272  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:20.092280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:20.092291  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:20.150597  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:20.150617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:20.166734  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:20.166751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:20.235344  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:20.235354  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:20.235368  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:20.300971  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:20.300991  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:22.830503  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:22.840565  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:22.840627  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:22.865965  420062 cri.go:89] found id: ""
	I1217 20:35:22.865980  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.865987  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:22.865992  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:22.866051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:22.890981  420062 cri.go:89] found id: ""
	I1217 20:35:22.890995  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.891002  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:22.891007  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:22.891067  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:22.916050  420062 cri.go:89] found id: ""
	I1217 20:35:22.916064  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.916070  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:22.916075  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:22.916134  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:22.940231  420062 cri.go:89] found id: ""
	I1217 20:35:22.940244  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.940274  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:22.940280  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:22.940338  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:22.964651  420062 cri.go:89] found id: ""
	I1217 20:35:22.964665  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.964673  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:22.964678  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:22.964739  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:22.999102  420062 cri.go:89] found id: ""
	I1217 20:35:22.999118  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.999126  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:22.999133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:22.999201  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:23.031417  420062 cri.go:89] found id: ""
	I1217 20:35:23.031431  420062 logs.go:282] 0 containers: []
	W1217 20:35:23.031440  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:23.031447  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:23.031458  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:23.099279  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:23.099300  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:23.127896  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:23.127914  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:23.184706  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:23.184725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:23.199879  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:23.199895  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:23.267184  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:25.768885  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:25.778947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:25.779017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:25.802991  420062 cri.go:89] found id: ""
	I1217 20:35:25.803005  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.803025  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:25.803031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:25.803093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:25.830724  420062 cri.go:89] found id: ""
	I1217 20:35:25.830738  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.830745  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:25.830751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:25.830813  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:25.860059  420062 cri.go:89] found id: ""
	I1217 20:35:25.860073  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.860081  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:25.860085  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:25.860150  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:25.896087  420062 cri.go:89] found id: ""
	I1217 20:35:25.896101  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.896108  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:25.896114  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:25.896173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:25.921891  420062 cri.go:89] found id: ""
	I1217 20:35:25.921905  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.921912  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:25.921918  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:25.921975  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:25.946115  420062 cri.go:89] found id: ""
	I1217 20:35:25.946129  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.946137  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:25.946142  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:25.946199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:25.970696  420062 cri.go:89] found id: ""
	I1217 20:35:25.970711  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.970719  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:25.970727  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:25.970737  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:26.031476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:26.031497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:26.053026  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:26.053044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:26.121268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:26.121279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:26.121290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:26.183866  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:26.183888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:28.713125  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:28.723373  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:28.723436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:28.750204  420062 cri.go:89] found id: ""
	I1217 20:35:28.750218  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.750225  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:28.750231  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:28.750295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:28.774507  420062 cri.go:89] found id: ""
	I1217 20:35:28.774520  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.774528  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:28.774533  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:28.774593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:28.799202  420062 cri.go:89] found id: ""
	I1217 20:35:28.799217  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.799225  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:28.799230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:28.799295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:28.823894  420062 cri.go:89] found id: ""
	I1217 20:35:28.823908  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.823916  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:28.823921  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:28.823981  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:28.848696  420062 cri.go:89] found id: ""
	I1217 20:35:28.848710  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.848717  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:28.848722  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:28.848780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:28.874108  420062 cri.go:89] found id: ""
	I1217 20:35:28.874121  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.874129  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:28.874146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:28.874206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:28.899607  420062 cri.go:89] found id: ""
	I1217 20:35:28.899621  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.899628  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:28.899636  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:28.899646  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:28.955990  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:28.956010  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:28.970828  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:28.970844  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:29.048596  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:29.048606  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:29.048627  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:29.115475  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:29.115495  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:31.644907  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:31.654819  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:31.654879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:31.678281  420062 cri.go:89] found id: ""
	I1217 20:35:31.678295  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.678303  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:31.678308  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:31.678370  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:31.702902  420062 cri.go:89] found id: ""
	I1217 20:35:31.702916  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.702923  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:31.702929  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:31.702988  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:31.730614  420062 cri.go:89] found id: ""
	I1217 20:35:31.730629  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.730643  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:31.730648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:31.730715  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:31.757724  420062 cri.go:89] found id: ""
	I1217 20:35:31.757738  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.757745  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:31.757751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:31.757821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:31.781313  420062 cri.go:89] found id: ""
	I1217 20:35:31.781326  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.781333  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:31.781338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:31.781401  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:31.805048  420062 cri.go:89] found id: ""
	I1217 20:35:31.805061  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.805068  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:31.805074  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:31.805133  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:31.829157  420062 cri.go:89] found id: ""
	I1217 20:35:31.829172  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.829178  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:31.829186  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:31.829211  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:31.884232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:31.884262  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:31.899125  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:31.899143  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:31.960768  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:31.960779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:31.960789  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:32.026560  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:32.026580  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:34.561956  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:34.573345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:34.573414  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:34.601971  420062 cri.go:89] found id: ""
	I1217 20:35:34.601985  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.601993  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:34.601998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:34.602057  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:34.631487  420062 cri.go:89] found id: ""
	I1217 20:35:34.631500  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.631508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:34.631513  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:34.631572  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:34.656452  420062 cri.go:89] found id: ""
	I1217 20:35:34.656465  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.656473  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:34.656478  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:34.656540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:34.682582  420062 cri.go:89] found id: ""
	I1217 20:35:34.682596  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.682603  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:34.682609  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:34.682676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:34.713925  420062 cri.go:89] found id: ""
	I1217 20:35:34.713939  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.713947  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:34.713952  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:34.714017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:34.742385  420062 cri.go:89] found id: ""
	I1217 20:35:34.742400  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.742408  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:34.742414  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:34.742473  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:34.767035  420062 cri.go:89] found id: ""
	I1217 20:35:34.767049  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.767056  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:34.767064  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:34.767075  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:34.822796  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:34.822817  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:34.837590  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:34.837613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:34.900508  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:34.900518  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:34.900529  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:34.962881  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:34.962905  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:37.494984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:37.505451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:37.505514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:37.530852  420062 cri.go:89] found id: ""
	I1217 20:35:37.530866  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.530874  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:37.530885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:37.530948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:37.555283  420062 cri.go:89] found id: ""
	I1217 20:35:37.555298  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.555305  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:37.555319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:37.555384  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:37.580310  420062 cri.go:89] found id: ""
	I1217 20:35:37.580324  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.580342  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:37.580347  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:37.580407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:37.604561  420062 cri.go:89] found id: ""
	I1217 20:35:37.604575  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.604582  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:37.604587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:37.604649  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:37.633577  420062 cri.go:89] found id: ""
	I1217 20:35:37.633591  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.633598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:37.633603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:37.633668  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:37.659137  420062 cri.go:89] found id: ""
	I1217 20:35:37.659152  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.659159  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:37.659183  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:37.659280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:37.687689  420062 cri.go:89] found id: ""
	I1217 20:35:37.687704  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.687711  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:37.687719  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:37.687738  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:37.742459  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:37.742478  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:37.757175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:37.757191  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:37.822005  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:37.822015  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:37.822025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:37.885848  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:37.885870  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:40.416602  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:40.427031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:40.427099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:40.452190  420062 cri.go:89] found id: ""
	I1217 20:35:40.452204  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.452212  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:40.452218  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:40.452299  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:40.478942  420062 cri.go:89] found id: ""
	I1217 20:35:40.478956  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.478963  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:40.478969  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:40.479027  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:40.504873  420062 cri.go:89] found id: ""
	I1217 20:35:40.504886  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.504893  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:40.504898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:40.504958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:40.530153  420062 cri.go:89] found id: ""
	I1217 20:35:40.530167  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.530173  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:40.530179  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:40.530239  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:40.558703  420062 cri.go:89] found id: ""
	I1217 20:35:40.558717  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.558725  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:40.558731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:40.558799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:40.583753  420062 cri.go:89] found id: ""
	I1217 20:35:40.583768  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.583777  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:40.583793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:40.583856  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:40.608061  420062 cri.go:89] found id: ""
	I1217 20:35:40.608075  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.608083  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:40.608099  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:40.608111  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:40.665201  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:40.665222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:40.680290  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:40.680307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:40.752424  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:40.752435  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:40.752446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:40.819510  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:40.819535  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.356404  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:43.367228  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:43.367293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:43.391809  420062 cri.go:89] found id: ""
	I1217 20:35:43.391824  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.391831  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:43.391836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:43.391895  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:43.417869  420062 cri.go:89] found id: ""
	I1217 20:35:43.417883  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.417890  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:43.417895  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:43.417959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:43.443314  420062 cri.go:89] found id: ""
	I1217 20:35:43.443328  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.443335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:43.443340  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:43.443400  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:43.469332  420062 cri.go:89] found id: ""
	I1217 20:35:43.469346  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.469352  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:43.469358  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:43.469418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:43.494242  420062 cri.go:89] found id: ""
	I1217 20:35:43.494256  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.494264  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:43.494277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:43.494341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:43.520502  420062 cri.go:89] found id: ""
	I1217 20:35:43.520515  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.520523  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:43.520529  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:43.520592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:43.549390  420062 cri.go:89] found id: ""
	I1217 20:35:43.549404  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.549411  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:43.549419  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:43.549435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:43.565708  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:43.565725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:43.633544  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:43.633555  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:43.633567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:43.696433  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:43.696457  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.727227  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:43.727244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.288373  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:46.298318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:46.298381  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:46.322903  420062 cri.go:89] found id: ""
	I1217 20:35:46.322918  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.322925  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:46.322931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:46.322992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:46.347241  420062 cri.go:89] found id: ""
	I1217 20:35:46.347253  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.347260  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:46.347265  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:46.347324  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:46.372209  420062 cri.go:89] found id: ""
	I1217 20:35:46.372222  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.372229  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:46.372235  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:46.372313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:46.399343  420062 cri.go:89] found id: ""
	I1217 20:35:46.399357  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.399365  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:46.399370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:46.399430  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:46.425023  420062 cri.go:89] found id: ""
	I1217 20:35:46.425036  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.425051  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:46.425057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:46.425119  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:46.450066  420062 cri.go:89] found id: ""
	I1217 20:35:46.450080  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.450087  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:46.450092  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:46.450153  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:46.474598  420062 cri.go:89] found id: ""
	I1217 20:35:46.474612  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.474619  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:46.474644  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:46.474654  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:46.536781  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:46.536801  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:46.570140  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:46.570155  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.628870  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:46.628888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:46.643875  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:46.643891  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:46.709883  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:49.210139  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:49.220394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:49.220461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:49.256343  420062 cri.go:89] found id: ""
	I1217 20:35:49.256358  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.256365  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:49.256370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:49.256431  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:49.290171  420062 cri.go:89] found id: ""
	I1217 20:35:49.290185  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.290193  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:49.290198  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:49.290261  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:49.320916  420062 cri.go:89] found id: ""
	I1217 20:35:49.320931  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.320939  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:49.320944  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:49.321003  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:49.345394  420062 cri.go:89] found id: ""
	I1217 20:35:49.345408  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.345415  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:49.345421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:49.345478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:49.370339  420062 cri.go:89] found id: ""
	I1217 20:35:49.370353  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.370360  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:49.370365  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:49.370424  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:49.394642  420062 cri.go:89] found id: ""
	I1217 20:35:49.394656  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.394663  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:49.394668  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:49.394734  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:49.422548  420062 cri.go:89] found id: ""
	I1217 20:35:49.422562  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.422569  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:49.422577  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:49.422594  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:49.479225  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:49.479246  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:49.494238  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:49.494255  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:49.560086  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:49.560096  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:49.560106  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:49.622094  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:49.622114  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.150210  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:52.160168  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:52.160231  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:52.184746  420062 cri.go:89] found id: ""
	I1217 20:35:52.184760  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.184767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:52.184779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:52.184835  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:52.209501  420062 cri.go:89] found id: ""
	I1217 20:35:52.209515  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.209522  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:52.209528  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:52.209586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:52.234558  420062 cri.go:89] found id: ""
	I1217 20:35:52.234571  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.234579  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:52.234584  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:52.234654  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:52.265703  420062 cri.go:89] found id: ""
	I1217 20:35:52.265716  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.265724  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:52.265729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:52.265794  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:52.297248  420062 cri.go:89] found id: ""
	I1217 20:35:52.297263  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.297270  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:52.297275  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:52.297334  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:52.325342  420062 cri.go:89] found id: ""
	I1217 20:35:52.325355  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.325362  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:52.325367  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:52.325433  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:52.349812  420062 cri.go:89] found id: ""
	I1217 20:35:52.349826  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.349843  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:52.349851  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:52.349862  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.380735  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:52.380751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:52.436131  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:52.436151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:52.451427  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:52.451445  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:52.518482  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:52.518492  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:52.518503  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:55.081073  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:55.091720  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:55.091797  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:55.117311  420062 cri.go:89] found id: ""
	I1217 20:35:55.117325  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.117333  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:55.117338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:55.117398  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:55.141668  420062 cri.go:89] found id: ""
	I1217 20:35:55.141683  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.141692  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:55.141697  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:55.141760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:55.166517  420062 cri.go:89] found id: ""
	I1217 20:35:55.166534  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.166541  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:55.166546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:55.166611  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:55.191282  420062 cri.go:89] found id: ""
	I1217 20:35:55.191296  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.191304  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:55.191309  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:55.191369  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:55.215605  420062 cri.go:89] found id: ""
	I1217 20:35:55.215619  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.215626  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:55.215631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:55.215690  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:55.247101  420062 cri.go:89] found id: ""
	I1217 20:35:55.247124  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.247132  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:55.247137  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:55.247205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:55.288704  420062 cri.go:89] found id: ""
	I1217 20:35:55.288718  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.288725  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:55.288732  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:55.288743  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:55.320382  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:55.320398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:55.379997  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:55.380016  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:55.394762  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:55.394780  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:55.459997  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:55.460007  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:55.460018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:58.024408  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:58.035410  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:58.035478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:58.062124  420062 cri.go:89] found id: ""
	I1217 20:35:58.062138  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.062145  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:58.062151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:58.062211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:58.088229  420062 cri.go:89] found id: ""
	I1217 20:35:58.088243  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.088270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:58.088276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:58.088335  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:58.113240  420062 cri.go:89] found id: ""
	I1217 20:35:58.113255  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.113261  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:58.113266  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:58.113325  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:58.141811  420062 cri.go:89] found id: ""
	I1217 20:35:58.141825  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.141832  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:58.141837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:58.141897  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:58.170463  420062 cri.go:89] found id: ""
	I1217 20:35:58.170477  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.170484  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:58.170490  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:58.170548  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:58.194647  420062 cri.go:89] found id: ""
	I1217 20:35:58.194670  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.194678  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:58.194684  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:58.194760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:58.219714  420062 cri.go:89] found id: ""
	I1217 20:35:58.219728  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.219735  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:58.219743  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:58.219754  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:58.263178  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:58.263194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:58.325412  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:58.325433  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:58.341419  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:58.341435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:58.403135  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:58.403147  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:58.403163  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
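The cycle above is one probe per control-plane component, repeated until something shows up. A minimal shell equivalent of that loop, with the component names and crictl flags taken verbatim from the log (running it directly on the node is an assumption; the test drives these commands over SSH):

    # Ask the CRI runtime for any container, running or exited, matching each name.
    # An empty result is what the log reports as: No container was found matching "<name>"
    for c in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$c")
      [ -z "$ids" ] && echo "No container was found matching \"$c\""
    done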
	I1217 20:36:00.965498  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:00.975759  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:00.975820  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:01.000786  420062 cri.go:89] found id: ""
	I1217 20:36:01.000803  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.000811  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:01.000818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:01.000892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:01.025695  420062 cri.go:89] found id: ""
	I1217 20:36:01.025709  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.025716  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:01.025721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:01.025784  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:01.054712  420062 cri.go:89] found id: ""
	I1217 20:36:01.054727  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.054734  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:01.054739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:01.054799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:01.083318  420062 cri.go:89] found id: ""
	I1217 20:36:01.083332  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.083340  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:01.083345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:01.083406  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:01.107939  420062 cri.go:89] found id: ""
	I1217 20:36:01.107954  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.107962  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:01.107968  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:01.108030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:01.134926  420062 cri.go:89] found id: ""
	I1217 20:36:01.134940  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.134947  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:01.134954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:01.135018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:01.161095  420062 cri.go:89] found id: ""
	I1217 20:36:01.161111  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.161121  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:01.161130  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:01.161141  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:01.222094  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:01.222112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:01.239432  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:01.239449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:01.331243  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:36:01.331254  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:01.331265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:01.398128  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:01.398148  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
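Each "Gathering logs for ..." step maps to a single host command; collected in one place for reference, with every path, unit name, and the -n 400 window copied from the log lines above:

    sudo journalctl -u kubelet -n 400
    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig
    sudo journalctl -u containerd -n 400
    sudo `which crictl || echo crictl` ps -a || sudo docker ps -a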
	I1217 20:36:03.929660  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:03.940045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:03.940111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:03.963644  420062 cri.go:89] found id: ""
	I1217 20:36:03.963658  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.963665  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:03.963670  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:03.963727  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:03.996893  420062 cri.go:89] found id: ""
	I1217 20:36:03.996907  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.996914  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:03.996919  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:03.996987  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:04.028499  420062 cri.go:89] found id: ""
	I1217 20:36:04.028514  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.028530  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:04.028535  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:04.028607  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:04.054700  420062 cri.go:89] found id: ""
	I1217 20:36:04.054715  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.054723  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:04.054728  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:04.054785  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:04.082040  420062 cri.go:89] found id: ""
	I1217 20:36:04.082054  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.082063  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:04.082068  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:04.082131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:04.107015  420062 cri.go:89] found id: ""
	I1217 20:36:04.107029  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.107037  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:04.107043  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:04.107109  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:04.134634  420062 cri.go:89] found id: ""
	I1217 20:36:04.134648  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.134655  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:04.134663  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:04.134673  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:04.191059  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:04.191079  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:04.206280  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:04.206298  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:04.297698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:36:04.297708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:04.297719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:04.364378  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:04.364398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:06.892149  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:06.902353  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:06.902418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:06.927834  420062 cri.go:89] found id: ""
	I1217 20:36:06.927847  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.927855  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:06.927860  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:06.927925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:06.952936  420062 cri.go:89] found id: ""
	I1217 20:36:06.952949  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.952956  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:06.952965  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:06.953024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:06.976184  420062 cri.go:89] found id: ""
	I1217 20:36:06.976198  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.976205  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:06.976210  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:06.976297  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:07.004079  420062 cri.go:89] found id: ""
	I1217 20:36:07.004093  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.004101  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:07.004106  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:07.004167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:07.029604  420062 cri.go:89] found id: ""
	I1217 20:36:07.029618  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.029625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:07.029630  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:07.029698  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:07.058618  420062 cri.go:89] found id: ""
	I1217 20:36:07.058637  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.058645  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:07.058650  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:07.058709  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:07.085932  420062 cri.go:89] found id: ""
	I1217 20:36:07.085946  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.085953  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:07.085961  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:07.085972  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:07.100543  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:07.100561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:07.162557  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:36:07.162567  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:07.162578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:07.226244  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:07.226265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:07.280558  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:07.280574  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:09.844282  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:09.854593  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:09.854676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:09.883180  420062 cri.go:89] found id: ""
	I1217 20:36:09.883194  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.883202  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:09.883208  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:09.883268  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:09.907225  420062 cri.go:89] found id: ""
	I1217 20:36:09.907240  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.907248  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:09.907254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:09.907315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:09.936079  420062 cri.go:89] found id: ""
	I1217 20:36:09.936093  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.936100  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:09.936105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:09.936167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:09.961921  420062 cri.go:89] found id: ""
	I1217 20:36:09.961935  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.961943  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:09.961949  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:09.962028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:09.989285  420062 cri.go:89] found id: ""
	I1217 20:36:09.989299  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.989307  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:09.989312  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:09.989371  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:10.023888  420062 cri.go:89] found id: ""
	I1217 20:36:10.023905  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.023913  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:10.023920  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:10.023992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:10.056062  420062 cri.go:89] found id: ""
	I1217 20:36:10.056077  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.056084  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:10.056102  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:10.056112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:10.118144  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:10.118165  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:10.153504  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:10.153521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:10.209909  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:10.209931  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:10.224930  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:10.224946  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:10.310457  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:36:12.811296  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:12.821279  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:12.821339  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:12.845496  420062 cri.go:89] found id: ""
	I1217 20:36:12.845510  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.845519  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:12.845524  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:12.845582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:12.873951  420062 cri.go:89] found id: ""
	I1217 20:36:12.873966  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.873973  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:12.873978  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:12.874039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:12.898560  420062 cri.go:89] found id: ""
	I1217 20:36:12.898573  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.898580  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:12.898586  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:12.898661  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:12.931323  420062 cri.go:89] found id: ""
	I1217 20:36:12.931343  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.931350  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:12.931356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:12.931416  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:12.957667  420062 cri.go:89] found id: ""
	I1217 20:36:12.957680  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.957687  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:12.957692  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:12.957749  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:12.981848  420062 cri.go:89] found id: ""
	I1217 20:36:12.981863  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.981870  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:12.981876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:12.981934  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:13.007649  420062 cri.go:89] found id: ""
	I1217 20:36:13.007664  420062 logs.go:282] 0 containers: []
	W1217 20:36:13.007671  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:13.007679  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:13.007689  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:13.070827  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:13.070846  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:13.098938  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:13.098954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:13.155232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:13.155253  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:13.170218  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:13.170234  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:13.237601  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
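Every describe-nodes attempt fails the same way: nothing answers on localhost:8441. Two hedged one-liners to confirm that from the node (ss and curl here are assumptions for manual triage, not commands the test ran):

    sudo ss -ltnp | grep ':8441' || echo 'no listener on 8441'
    curl -ksS https://localhost:8441/healthz || true   # expect 'connection refused' while no apiserver is up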
	I1217 20:36:15.739451  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:15.749635  420062 kubeadm.go:602] duration metric: took 4m4.768391835s to restartPrimaryControlPlane
	W1217 20:36:15.749706  420062 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 20:36:15.749781  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:36:16.165425  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:36:16.179463  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:36:16.187987  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:36:16.188041  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:36:16.195805  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:36:16.195815  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:36:16.195868  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:36:16.203578  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:36:16.203633  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:36:16.211222  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:36:16.218882  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:36:16.218939  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:36:16.226500  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.233980  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:36:16.234040  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.241486  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:36:16.250121  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:36:16.250177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
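The sweep above applies one rule per kubeconfig: keep the file only if it already points at the expected endpoint, otherwise delete it. A compact sketch of the same logic, with the file list and endpoint copied from the log:

    for f in admin kubelet controller-manager scheduler; do
      conf=/etc/kubernetes/$f.conf
      sudo grep -q 'https://control-plane.minikube.internal:8441' "$conf" || sudo rm -f "$conf"
    done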
	I1217 20:36:16.257963  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:36:16.296719  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:36:16.297028  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:36:16.367021  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:36:16.367085  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:36:16.367119  420062 kubeadm.go:319] OS: Linux
	I1217 20:36:16.367163  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:36:16.367211  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:36:16.367257  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:36:16.367304  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:36:16.367351  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:36:16.367397  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:36:16.367441  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:36:16.367493  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:36:16.367539  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:36:16.443855  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:36:16.443958  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:36:16.444047  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
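The pre-pull suggested by the preflight hint can be run ahead of time; a hedged form pinned to the version this init uses (the --kubernetes-version flag is an addition here, the log only names the bare subcommand):

    sudo kubeadm config images pull --kubernetes-version v1.35.0-rc.1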
	I1217 20:36:16.456800  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:36:16.459720  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:36:16.459808  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:36:16.459875  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:36:16.459957  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:36:16.460026  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:36:16.460100  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:36:16.460156  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:36:16.460222  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:36:16.460299  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:36:16.460377  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:36:16.460454  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:36:16.460493  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:36:16.460552  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:36:16.591707  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:36:16.773515  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:36:16.895942  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:36:17.316963  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:36:17.418134  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:36:17.418872  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:36:17.421748  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:36:17.424898  420062 out.go:252]   - Booting up control plane ...
	I1217 20:36:17.424999  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:36:17.425075  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:36:17.425522  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:36:17.446706  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:36:17.446809  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:36:17.455830  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:36:17.455925  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:36:17.455963  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:36:17.596746  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:36:17.596869  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:40:17.595000  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000220112s
	I1217 20:40:17.595032  420062 kubeadm.go:319] 
	I1217 20:40:17.595086  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:40:17.595116  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:40:17.595215  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:40:17.595220  420062 kubeadm.go:319] 
	I1217 20:40:17.595317  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:40:17.595346  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:40:17.595375  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:40:17.595378  420062 kubeadm.go:319] 
	I1217 20:40:17.599582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:40:17.600077  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:40:17.600181  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:40:17.600461  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:40:17.600468  420062 kubeadm.go:319] 
	I1217 20:40:17.600540  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
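The cgroups v1 warning above names the opt-in: kubelet v1.35 or newer refuses cgroup v1 nodes unless FailCgroupV1 is set to false. A hedged way to apply that on this node (the config path is the one kubeadm wrote above; the lowercase YAML key failCgroupV1 is inferred from the option name in the warning):

    # Opt kubelet back in to cgroups v1, then restart it.
    # The key name 'failCgroupV1' is an assumption based on the warning text.
    echo 'failCgroupV1: false' | sudo tee -a /var/lib/kubelet/config.yaml
    sudo systemctl restart kubelet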
	W1217 20:40:17.600694  420062 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
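The root failure is a kubelet that never reported healthy within the 4m0s window. The probe kubeadm ran and the two triage commands it suggests, all three taken verbatim from the output above and runnable by hand on the node:

    curl -sSL http://127.0.0.1:10248/healthz   # kubeadm's health probe; it timed out here
    systemctl status kubelet
    journalctl -xeu kubelet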
	
	I1217 20:40:17.600780  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:40:18.014309  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:40:18.029681  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:40:18.029742  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:40:18.038728  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:40:18.038739  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:40:18.038796  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:40:18.047726  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:40:18.047785  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:40:18.056139  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:40:18.064964  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:40:18.065020  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:40:18.073071  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.081347  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:40:18.081407  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.089386  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:40:18.097546  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:40:18.097608  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
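The grep/rm sequence above is minikube's stale-kubeconfig cleanup: each component kubeconfig under /etc/kubernetes survives only if it already references the expected control-plane endpoint (here every file is simply absent, so there is nothing to keep). The same logic condensed into a sketch, with the endpoint and file names taken from the log:

	# Drop any component kubeconfig that does not reference the expected endpoint;
	# grep -s stays quiet when the file is missing, and rm -f tolerates its absence.
	for f in admin kubelet controller-manager scheduler; do
	  grep -qs "https://control-plane.minikube.internal:8441" "/etc/kubernetes/$f.conf" \
	    || sudo rm -f "/etc/kubernetes/$f.conf"
	done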
	I1217 20:40:18.105445  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:40:18.146508  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:40:18.146883  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:40:18.223079  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:40:18.223139  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:40:18.223171  420062 kubeadm.go:319] OS: Linux
	I1217 20:40:18.223212  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:40:18.223257  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:40:18.223306  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:40:18.223354  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:40:18.223398  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:40:18.223442  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:40:18.223484  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:40:18.223529  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:40:18.223571  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:40:18.290116  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:40:18.290214  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:40:18.290297  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:40:18.296827  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:40:18.300313  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:40:18.300404  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:40:18.300483  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:40:18.300564  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:40:18.300623  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:40:18.300692  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:40:18.300745  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:40:18.300806  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:40:18.300867  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:40:18.300940  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:40:18.301011  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:40:18.301047  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:40:18.301101  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:40:18.651136  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:40:18.865861  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:40:19.156184  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:40:19.613234  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:40:19.777874  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:40:19.778689  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:40:19.781521  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:40:19.784636  420062 out.go:252]   - Booting up control plane ...
	I1217 20:40:19.784726  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:40:19.784798  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:40:19.786110  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:40:19.806173  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:40:19.806463  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:40:19.814039  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:40:19.814294  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:40:19.814465  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:40:19.960654  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:40:19.960777  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:44:19.954818  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001239508s
	I1217 20:44:19.954843  420062 kubeadm.go:319] 
	I1217 20:44:19.954896  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:44:19.954927  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:44:19.955102  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:44:19.955108  420062 kubeadm.go:319] 
	I1217 20:44:19.955205  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:44:19.955233  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:44:19.955262  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:44:19.955265  420062 kubeadm.go:319] 
	I1217 20:44:19.960153  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:44:19.960582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:44:19.960689  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:44:19.960924  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:44:19.960929  420062 kubeadm.go:319] 
	I1217 20:44:19.960996  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 20:44:19.961048  420062 kubeadm.go:403] duration metric: took 12m9.01968184s to StartCluster
	I1217 20:44:19.961079  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:44:19.961139  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:44:19.999166  420062 cri.go:89] found id: ""
	I1217 20:44:19.999182  420062 logs.go:282] 0 containers: []
	W1217 20:44:19.999190  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:44:19.999195  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:44:19.999265  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:44:20.031203  420062 cri.go:89] found id: ""
	I1217 20:44:20.031218  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.031225  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:44:20.031230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:44:20.031293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:44:20.061179  420062 cri.go:89] found id: ""
	I1217 20:44:20.061193  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.061200  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:44:20.061219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:44:20.061280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:44:20.089093  420062 cri.go:89] found id: ""
	I1217 20:44:20.089107  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.089114  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:44:20.089120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:44:20.089183  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:44:20.119683  420062 cri.go:89] found id: ""
	I1217 20:44:20.119696  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.119704  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:44:20.119709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:44:20.119772  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:44:20.145500  420062 cri.go:89] found id: ""
	I1217 20:44:20.145514  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.145521  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:44:20.145526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:44:20.145586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:44:20.170345  420062 cri.go:89] found id: ""
	I1217 20:44:20.170359  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.170367  420062 logs.go:284] No container was found matching "kindnet"
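Since kubeadm never brought the control plane up, minikube now probes the CRI for each expected control-plane and CNI container by name and finds none, which is why the log gathering below falls back to kubelet, dmesg, and containerd. The seven lookups condensed into one sketch (same crictl invocation as in the log):

	# Empty output for a name means no such container was ever created.
	for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
	            kube-controller-manager kindnet; do
	  sudo crictl ps -a --quiet --name="$name"
	done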
	I1217 20:44:20.170377  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:44:20.170387  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:44:20.226476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:44:20.226496  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:44:20.241970  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:44:20.241987  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:44:20.311525  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:44:20.311535  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:44:20.311546  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:44:20.375759  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:44:20.375781  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 20:44:20.404823  420062 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:44:20.404857  420062 out.go:285] * 
	W1217 20:44:20.404931  420062 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.404948  420062 out.go:285] * 
	W1217 20:44:20.407052  420062 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:44:20.412138  420062 out.go:203] 
	W1217 20:44:20.415946  420062 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.415994  420062 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:44:20.416018  420062 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:44:20.419093  420062 out.go:203] 
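The two suggestion lines above are minikube's generic advice for K8S_KUBELET_NOT_RUNNING. Applied to this run it would look roughly like the sketch below; the profile name and flags are reconstructed from this log, and the cgroup-driver override is only the message's hypothesis (the kubelet journal further down points at cgroup v1 support itself):

	# Read the kubelet's own failure reason inside the node container first:
	minikube ssh -p functional-682596 -- sudo journalctl -xeu kubelet | tail -n 50
	# Then retry the start with the suggested kubelet cgroup-driver override:
	minikube start -p functional-682596 --driver=docker --container-runtime=containerd \
	  --kubernetes-version=v1.35.0-rc.1 --extra-config=kubelet.cgroup-driver=systemd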
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.485700024Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.528847726Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.532364395Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.541542678Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.562388570Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.886013953Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.888517483Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896652382Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896979157Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.215090972Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.217900174Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.221232159Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.234208610Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.526440562Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.528721959Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.537888598Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.538209006Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.567338451Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.569906392Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.572864134Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.580393179Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.405679689Z" level=info msg="No images store for sha256:05371fd6ad950eede907960b388fa9b50b39adf62f93dec0b13c9fc4ce7e1bc1"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.408072965Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.415708801Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.416196737Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:46:50.531282   23697 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:50.532533   23697 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:50.534325   23697 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:50.534654   23697 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:50.536153   23697 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:46:50 up  3:29,  0 user,  load average: 0.48, 0.49, 0.53
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:46:47 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 517.
	Dec 17 20:46:48 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:48 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:48 functional-682596 kubelet[23525]: E1217 20:46:48.286211   23525 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 518.
	Dec 17 20:46:48 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:48 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:48 functional-682596 kubelet[23572]: E1217 20:46:48.986467   23572 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:48 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:49 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 519.
	Dec 17 20:46:49 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:49 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:49 functional-682596 kubelet[23610]: E1217 20:46:49.803276   23610 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:49 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:49 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:50 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 520.
	Dec 17 20:46:50 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:50 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:50 functional-682596 kubelet[23702]: E1217 20:46:50.549386   23702 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:50 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:50 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
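The kubelet section at the end of the dump holds the actual root cause of the 4m0s health-check timeouts: kubelet v1.35.0-rc.1 refuses to validate its configuration on a cgroup v1 host, and systemd has already restarted it over 500 times. A quick way to confirm a host's cgroup version, independent of this harness (plain coreutils, not part of the test run):

	# cgroup2fs means cgroups v2 (unified); tmpfs means cgroups v1 (legacy/hybrid).
	stat -fc %T /sys/fs/cgroup/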
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (379.080908ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.51s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-682596 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-682596 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (55.784144ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-682596 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-682596 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-682596 describe po hello-node-connect: exit status 1 (64.468316ms)

** stderr ** 
	E1217 20:46:45.988935  436646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.990513  436646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.991904  436646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.993348  436646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.994753  436646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1614: "kubectl --context functional-682596 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-682596 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-682596 logs -l app=hello-node-connect: exit status 1 (86.484469ms)

** stderr ** 
	E1217 20:46:46.076730  436656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.078291  436656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.079782  436656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.081299  436656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1620: "kubectl --context functional-682596 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-682596 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-682596 describe svc hello-node-connect: exit status 1 (65.152741ms)

** stderr ** 
	E1217 20:46:46.139626  436662 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.141180  436662 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.142771  436662 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.144337  436662 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:46.145857  436662 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1626: "kubectl --context functional-682596 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
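The NetworkSettings.Ports block above shows how the node container's service ports are published on loopback (for example 8441/tcp -> 127.0.0.1:33166). A minimal sketch, assuming the docker CLI is on PATH and decoding only the fields used here, of extracting that mapping programmatically:

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // container mirrors just the NetworkSettings.Ports shape from the
    // `docker inspect` output above; all other fields are ignored.
    type container struct {
        NetworkSettings struct {
            Ports map[string][]struct {
                HostIp   string
                HostPort string
            }
        }
    }

    func main() {
        out, err := exec.Command("docker", "inspect", "functional-682596").Output()
        if err != nil {
            panic(err)
        }
        var cs []container // docker inspect prints a JSON array of containers
        if err := json.Unmarshal(out, &cs); err != nil {
            panic(err)
        }
        // For the inspect output above this prints "127.0.0.1:33166".
        for _, b := range cs[0].NetworkSettings.Ports["8441/tcp"] {
            fmt.Printf("%s:%s\n", b.HostIp, b.HostPort)
        }
    }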
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (291.821421ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-682596 ssh sudo cat /usr/share/ca-certificates/369461.pem                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/3694612.pem                                                                                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image save kicbase/echo-server:functional-682596 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /usr/share/ca-certificates/3694612.pem                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image rm kicbase/echo-server:functional-682596 --alsologtostderr                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo cat /etc/test/nested/copy/369461/hosts                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service list                                                                                                                                  │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ image   │ functional-682596 image ls                                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service list -o json                                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ image   │ functional-682596 image save --daemon kicbase/echo-server:functional-682596 --alsologtostderr                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service --namespace=default --https --url hello-node                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ ssh     │ functional-682596 ssh echo hello                                                                                                                                │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service hello-node --url --format={{.IP}}                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ ssh     │ functional-682596 ssh cat /etc/hostname                                                                                                                         │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ service │ functional-682596 service hello-node --url                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ tunnel  │ functional-682596 tunnel --alsologtostderr                                                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ addons  │ functional-682596 addons list                                                                                                                                   │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ addons  │ functional-682596 addons list -o json                                                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:32:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:32:06.395598  420062 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:32:06.395704  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395708  420062 out.go:374] Setting ErrFile to fd 2...
	I1217 20:32:06.395712  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395972  420062 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:32:06.396388  420062 out.go:368] Setting JSON to false
	I1217 20:32:06.397206  420062 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11672,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:32:06.397266  420062 start.go:143] virtualization:  
	I1217 20:32:06.400889  420062 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:32:06.403953  420062 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:32:06.404019  420062 notify.go:221] Checking for updates...
	I1217 20:32:06.410244  420062 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:32:06.413231  420062 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:32:06.416152  420062 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:32:06.419145  420062 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:32:06.422186  420062 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:32:06.425355  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:06.425444  420062 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:32:06.459431  420062 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:32:06.459555  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.531840  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.520070933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.531937  420062 docker.go:319] overlay module found
	I1217 20:32:06.535075  420062 out.go:179] * Using the docker driver based on existing profile
	I1217 20:32:06.538013  420062 start.go:309] selected driver: docker
	I1217 20:32:06.538025  420062 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.538123  420062 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:32:06.538239  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.599898  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.590438982 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.600362  420062 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:32:06.600387  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:06.600439  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:06.600480  420062 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.605529  420062 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:32:06.608314  420062 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:32:06.611190  420062 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:32:06.614228  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:06.614282  420062 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:32:06.614283  420062 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:32:06.614291  420062 cache.go:65] Caching tarball of preloaded images
	I1217 20:32:06.614394  420062 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:32:06.614404  420062 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:32:06.614527  420062 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:32:06.634867  420062 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:32:06.634879  420062 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:32:06.634892  420062 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:32:06.634927  420062 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:32:06.634983  420062 start.go:364] duration metric: took 39.828µs to acquireMachinesLock for "functional-682596"
	I1217 20:32:06.635002  420062 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:32:06.635007  420062 fix.go:54] fixHost starting: 
	I1217 20:32:06.635262  420062 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:32:06.652755  420062 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:32:06.652776  420062 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:32:06.656001  420062 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:32:06.656027  420062 machine.go:94] provisionDockerMachine start ...
	I1217 20:32:06.656117  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.673371  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.673711  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.673717  420062 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:32:06.807817  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.807832  420062 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:32:06.807905  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.825970  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.826266  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.826274  420062 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:32:06.965026  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.965108  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.983394  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.983695  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.983710  420062 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:32:07.116833  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: 
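The SSH command above keeps /etc/hosts idempotent: if a line already ends with the hostname it does nothing, otherwise it rewrites an existing 127.0.1.1 entry or appends one. A minimal Go sketch of the same logic (illustrative only; minikube performs this via the shell script shown):

    package main

    import (
        "fmt"
        "regexp"
        "strings"
    )

    // ensureHostsEntry mirrors the shell logic above: leave the file alone if
    // any line already ends with the hostname, else rewrite or append 127.0.1.1.
    func ensureHostsEntry(hosts, name string) string {
        if regexp.MustCompile(`(?m)^.*\s` + regexp.QuoteMeta(name) + `$`).MatchString(hosts) {
            return hosts // hostname already present
        }
        loopback := regexp.MustCompile(`(?m)^127\.0\.1\.1\s.*$`)
        if loopback.MatchString(hosts) {
            return loopback.ReplaceAllString(hosts, "127.0.1.1 "+name)
        }
        return strings.TrimRight(hosts, "\n") + "\n127.0.1.1 " + name + "\n"
    }

    func main() {
        fmt.Print(ensureHostsEntry("127.0.0.1 localhost\n", "functional-682596"))
    }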
	I1217 20:32:07.116850  420062 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:32:07.116869  420062 ubuntu.go:190] setting up certificates
	I1217 20:32:07.116877  420062 provision.go:84] configureAuth start
	I1217 20:32:07.116947  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.134531  420062 provision.go:143] copyHostCerts
	I1217 20:32:07.134601  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:32:07.134608  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:32:07.134696  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:32:07.134816  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:32:07.134820  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:32:07.134849  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:32:07.134907  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:32:07.134911  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:32:07.134937  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:32:07.134994  420062 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:32:07.402222  420062 provision.go:177] copyRemoteCerts
	I1217 20:32:07.402275  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:32:07.402313  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.421789  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.516787  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:32:07.535734  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:32:07.553569  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 20:32:07.572193  420062 provision.go:87] duration metric: took 455.301945ms to configureAuth
	I1217 20:32:07.572211  420062 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:32:07.572513  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:07.572520  420062 machine.go:97] duration metric: took 916.488302ms to provisionDockerMachine
	I1217 20:32:07.572527  420062 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:32:07.572544  420062 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:32:07.572595  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:32:07.572635  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.593078  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.688373  420062 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:32:07.691957  420062 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:32:07.691978  420062 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:32:07.691989  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:32:07.692044  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:32:07.692122  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:32:07.692197  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:32:07.692238  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:32:07.699873  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.718147  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:32:07.736089  420062 start.go:296] duration metric: took 163.546649ms for postStartSetup
	I1217 20:32:07.736163  420062 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:32:07.736210  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.753837  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.845496  420062 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:32:07.850448  420062 fix.go:56] duration metric: took 1.215434362s for fixHost
	I1217 20:32:07.850463  420062 start.go:83] releasing machines lock for "functional-682596", held for 1.215473649s
	I1217 20:32:07.850551  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.871450  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:07.871498  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:07.871505  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:07.871531  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:07.871602  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:07.871627  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:07.871680  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.871748  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:07.871798  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.889554  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.998672  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:08.024673  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:08.048014  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:08.055454  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.065155  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:08.073391  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077720  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077778  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.119356  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:08.127518  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.135465  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:08.143207  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147322  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147376  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.188376  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:08.196028  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.203401  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:08.211111  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214821  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214891  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.256072  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:08.263331  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:32:08.266724  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
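The openssl runs above implement the standard OpenSSL trust layout: each cert in /usr/share/ca-certificates gets a symlink in /etc/ssl/certs named after its subject hash plus ".0" (for example b5213941.0 for minikubeCA.pem). A minimal sketch of that pattern (hypothetical helper; it shells out to openssl rather than recomputing the legacy subject hash in Go):

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "path/filepath"
        "strings"
    )

    func main() {
        pem := "/usr/share/ca-certificates/minikubeCA.pem"
        // `openssl x509 -hash -noout` prints only the subject hash used as the link name.
        out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pem).Output()
        if err != nil {
            panic(err)
        }
        link := filepath.Join("/etc/ssl/certs", strings.TrimSpace(string(out))+".0")
        os.Remove(link) // equivalent of `ln -fs`: replace any stale link
        if err := os.Symlink(pem, link); err != nil {
            panic(err)
        }
        fmt.Println("linked", link, "->", pem)
    }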
	I1217 20:32:08.270040  420062 ssh_runner.go:195] Run: cat /version.json
	I1217 20:32:08.270111  420062 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:32:08.361093  420062 ssh_runner.go:195] Run: systemctl --version
	I1217 20:32:08.367706  420062 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:32:08.372063  420062 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:32:08.372127  420062 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:32:08.380119  420062 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:32:08.380133  420062 start.go:496] detecting cgroup driver to use...
	I1217 20:32:08.380163  420062 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:32:08.380223  420062 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:32:08.395765  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:32:08.409064  420062 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:32:08.409142  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:32:08.425141  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:32:08.438808  420062 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:32:08.558555  420062 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:32:08.681937  420062 docker.go:234] disabling docker service ...
	I1217 20:32:08.681997  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:32:08.701323  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:32:08.715923  420062 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:32:08.835610  420062 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:32:08.958372  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:32:08.972822  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:32:08.987570  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:32:08.997169  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:32:09.008742  420062 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:32:09.008821  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:32:09.018997  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.028318  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:32:09.037280  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.046375  420062 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:32:09.054925  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:32:09.064191  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:32:09.073303  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:32:09.082553  420062 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:32:09.090003  420062 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:32:09.097524  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.216967  420062 ssh_runner.go:195] Run: sudo systemctl restart containerd
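Taken together, the sed edits above shape /etc/containerd/config.toml before the restart. A sketch of the fields they target, assuming the CRI plugin layout that the `io.containerd.grpc.v1.cri` patterns in the commands imply (the exact file layout differs across containerd versions):

    [plugins."io.containerd.grpc.v1.cri"]
      sandbox_image = "registry.k8s.io/pause:3.10.1"
      restrict_oom_score_adj = false
      enable_unprivileged_ports = true
      [plugins."io.containerd.grpc.v1.cri".cni]
        conf_dir = "/etc/cni/net.d"
      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
        runtime_type = "io.containerd.runc.v2"
        [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
          SystemdCgroup = false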
	I1217 20:32:09.360558  420062 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:32:09.360617  420062 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:32:09.364443  420062 start.go:564] Will wait 60s for crictl version
	I1217 20:32:09.364497  420062 ssh_runner.go:195] Run: which crictl
	I1217 20:32:09.368129  420062 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:32:09.397262  420062 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:32:09.397334  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.420778  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.446347  420062 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:32:09.449338  420062 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:32:09.466521  420062 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:32:09.473221  420062 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1217 20:32:09.476024  420062 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:32:09.476173  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:09.476285  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.523837  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.523848  420062 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:32:09.523905  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.551003  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.551014  420062 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:32:09.551021  420062 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:32:09.551143  420062 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
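	(The generated unit above follows the standard systemd drop-in pattern: the bare ExecStart= line clears the ExecStart inherited from the base kubelet.service before the full command is set, which is why both lines appear. A minimal sketch of the same override, paths taken from this log:)
	    # /etc/systemd/system/kubelet.service.d/10-kubeadm.conf
	    [Service]
	    ExecStart=
	    ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	    # a changed drop-in only takes effect after:
	    #   sudo systemctl daemon-reload && sudo systemctl restart kubelet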
	I1217 20:32:09.551208  420062 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:32:09.578643  420062 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
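	(This override is the behavior under test: a user-supplied extra-config value replaces the component's default flag value rather than appending to it. The start flag that produces the ExtraOptions entry recorded above would look like this, as an illustrative invocation:)
	    # pass a custom admission-plugin list through to the apiserver
	    minikube start -p functional-682596 \
	      --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision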
	I1217 20:32:09.578665  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:09.578673  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:09.578683  420062 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:32:09.578707  420062 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:32:09.578827  420062 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
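	(A regenerated config like the one above can be sanity-checked offline before kubeadm consumes it; recent kubeadm releases ship a validate subcommand, though its presence on a given binary is an assumption worth confirming with kubeadm config --help:)
	    # validate the generated config without touching the cluster
	    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubeadm config validate \
	      --config /var/tmp/minikube/kubeadm.yaml.new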
	
	I1217 20:32:09.578904  420062 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:32:09.586879  420062 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:32:09.586939  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:32:09.594505  420062 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:32:09.607281  420062 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:32:09.619808  420062 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1217 20:32:09.632685  420062 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:32:09.636364  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.746796  420062 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:32:10.238623  420062 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:32:10.238634  420062 certs.go:195] generating shared ca certs ...
	I1217 20:32:10.238650  420062 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:32:10.238819  420062 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:32:10.238897  420062 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:32:10.238904  420062 certs.go:257] generating profile certs ...
	I1217 20:32:10.238995  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:32:10.239044  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:32:10.239082  420062 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:32:10.239190  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:10.239221  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:10.239227  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:10.239261  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:10.239282  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:10.239304  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:10.239345  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:10.239934  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:32:10.261870  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:32:10.286466  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:32:10.307033  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:32:10.325172  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:32:10.343499  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:32:10.361814  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:32:10.379595  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:32:10.397590  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:10.415855  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:10.435021  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:10.453267  420062 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:32:10.466474  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:10.472863  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.480366  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:10.487904  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491724  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491791  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.533110  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:10.540758  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.548093  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:10.555384  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.558983  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.559039  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.602447  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:10.609962  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.617251  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:10.625102  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629186  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629244  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.670572  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
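	(The hash-then-symlink sequence above is OpenSSL's standard CA directory layout: openssl x509 -hash prints the subject hash, and the trust store expects a <hash>.0 symlink in /etc/ssl/certs pointing at the PEM. A sketch of the same steps for one cert:)
	    # recreate the <subject-hash>.0 link OpenSSL looks up at verification time
	    h=$(openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem)
	    sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem "/etc/ssl/certs/${h}.0"
	    sudo test -L "/etc/ssl/certs/${h}.0" && echo "trust link in place"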
	I1217 20:32:10.678295  420062 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:32:10.682347  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:32:10.723286  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:32:10.764614  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:32:10.806369  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:32:10.856829  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:32:10.900136  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
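	(Each -checkend 86400 call above asks OpenSSL whether the certificate will still be valid 86400 seconds, i.e. 24 hours, from now; exit status 0 means yes, non-zero means it expires inside the window and would need regeneration. For a single cert:)
	    # non-zero exit here is what would trigger cert regeneration
	    sudo openssl x509 -noout -in /var/lib/minikube/certs/apiserver.crt -checkend 86400 \
	      && echo "valid for at least 24h" \
	      || echo "expires within 24h"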
	I1217 20:32:10.941380  420062 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:10.941458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:32:10.941532  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:10.973304  420062 cri.go:89] found id: ""
	I1217 20:32:10.973369  420062 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:32:10.981213  420062 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:32:10.981233  420062 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:32:10.981284  420062 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:32:10.989643  420062 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:10.990148  420062 kubeconfig.go:125] found "functional-682596" server: "https://192.168.49.2:8441"
	I1217 20:32:10.991404  420062 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:32:11.001770  420062 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 20:17:35.203485302 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 20:32:09.624537089 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
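	(Drift detection here leans on diff's exit status: 0 for identical files, 1 for differences, so the unified diff doubles as both the trigger and the log record. A minimal reproduction:)
	    # exit status 1 (files differ) is what flags the config drift above
	    if ! sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new; then
	      echo "kubeadm config drift detected; reconfiguring from the .new file"
	    fi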
	I1217 20:32:11.001793  420062 kubeadm.go:1161] stopping kube-system containers ...
	I1217 20:32:11.001810  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 20:32:11.001907  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:11.031815  420062 cri.go:89] found id: ""
	I1217 20:32:11.031894  420062 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 20:32:11.052689  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:32:11.061497  420062 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec 17 20:21 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 17 20:21 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 17 20:21 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 17 20:21 /etc/kubernetes/scheduler.conf
	
	I1217 20:32:11.061561  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:32:11.069861  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:32:11.077903  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.077964  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:32:11.085969  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.094098  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.094177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.102002  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:32:11.110213  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.110288  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
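	(The three grep/rm pairs above implement one rule: any kubeconfig that no longer mentions the expected control-plane endpoint is deleted so the kubeadm kubeconfig phase can regenerate it. Condensed into a loop, with the endpoint string taken from the log:)
	    # drop stale kubeconfigs; 'kubeadm init phase kubeconfig all' recreates them
	    ep='https://control-plane.minikube.internal:8441'
	    for f in kubelet.conf controller-manager.conf scheduler.conf; do
	      sudo grep -q "$ep" "/etc/kubernetes/$f" || sudo rm -f "/etc/kubernetes/$f"
	    done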
	I1217 20:32:11.119148  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:32:11.127567  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:11.176595  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.173518  420062 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.996897383s)
	I1217 20:32:13.173578  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.380045  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.450955  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
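	(On restart, minikube replays individual kubeadm init phases instead of running a full init, in the order shown above. Stripped of the PATH plumbing, the sequence is:)
	    # the phase sequence run for a control-plane restart
	    cfg=/var/tmp/minikube/kubeadm.yaml
	    sudo kubeadm init phase certs all --config "$cfg"
	    sudo kubeadm init phase kubeconfig all --config "$cfg"
	    sudo kubeadm init phase kubelet-start --config "$cfg"
	    sudo kubeadm init phase control-plane all --config "$cfg"
	    sudo kubeadm init phase etcd local --config "$cfg"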
	I1217 20:32:13.494559  420062 api_server.go:52] waiting for apiserver process to appear ...
	I1217 20:32:13.494629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:13.995499  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:14.495246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:14.995004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:15.494932  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:15.995036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:16.495074  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:16.994872  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:17.495380  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:17.995751  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:18.495343  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:18.994970  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:19.494770  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:19.994830  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:20.495505  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:20.994898  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:21.495023  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:21.995478  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:22.495349  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:22.995690  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:23.495439  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:23.995543  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:24.495694  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:24.995422  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:25.495295  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:25.994704  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:26.495710  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:26.995337  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:27.494832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:27.995523  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:28.494851  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:28.995537  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:29.495464  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:29.994938  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:30.494723  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:30.995506  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:31.494922  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:31.995021  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:32.495513  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:32.995616  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:33.494819  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:33.995255  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:34.495487  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:34.994841  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:35.494829  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:35.994738  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:36.495064  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:36.995222  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:37.495670  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:37.995598  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:38.495022  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:38.994778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:39.494800  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:39.995546  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:40.495339  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:40.995490  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:41.495730  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:41.995344  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:42.494837  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:42.994782  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:43.495499  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:43.994789  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:44.495147  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:44.994920  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:45.495463  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:45.994922  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:46.495042  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:46.994829  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:47.495629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:47.994850  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:48.495359  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:48.994705  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:49.494785  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:49.995746  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:50.495699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:50.994838  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:51.494890  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:51.995223  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:52.495608  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:52.995342  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:53.495633  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:53.994828  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:54.495690  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:54.995411  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:55.495390  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:55.994857  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:56.494814  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:56.995195  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:57.494792  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:57.995068  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:58.494828  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:58.995135  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:59.495101  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:32:59.994696  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:00.494847  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:00.994832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:01.495150  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:01.994869  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:02.494983  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:02.995441  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:03.495150  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:03.994800  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:04.494955  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:04.995595  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:05.495571  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:05.995745  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:06.494913  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:06.994802  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:07.494809  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:07.995731  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:08.495034  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:08.995352  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:09.494830  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:09.995574  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:10.495663  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:10.995478  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:11.494754  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:11.995704  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:12.494787  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:12.995364  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
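	(The ~500ms cadence of the pgrep lines above is the apiserver wait loop: minikube keeps probing for a kube-apiserver process until one appears or the wait times out. An equivalent shell loop, with an illustrative 60s cap rather than minikube's actual timeout:)
	    # poll for the apiserver process the way the log shows; give up after ~60s
	    for i in $(seq 120); do
	      sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null && break
	      sleep 0.5
	    done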
	I1217 20:33:13.495637  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:13.495716  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:13.520703  420062 cri.go:89] found id: ""
	I1217 20:33:13.520717  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.520724  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:13.520729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:13.520793  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:13.549658  420062 cri.go:89] found id: ""
	I1217 20:33:13.549672  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.549680  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:13.549685  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:13.549748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:13.574860  420062 cri.go:89] found id: ""
	I1217 20:33:13.574873  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.574880  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:13.574885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:13.574945  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:13.602159  420062 cri.go:89] found id: ""
	I1217 20:33:13.602173  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.602180  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:13.602185  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:13.602244  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:13.625735  420062 cri.go:89] found id: ""
	I1217 20:33:13.625748  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.625755  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:13.625760  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:13.625816  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:13.650446  420062 cri.go:89] found id: ""
	I1217 20:33:13.650460  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.650468  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:13.650473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:13.650533  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:13.677915  420062 cri.go:89] found id: ""
	I1217 20:33:13.677929  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.677936  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:13.677944  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:13.677954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:13.692434  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:13.692449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:13.767790  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:13.767810  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:13.767820  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:13.839665  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:13.839685  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:13.872573  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:13.872589  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.429115  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:16.438989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:16.439051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:16.466518  420062 cri.go:89] found id: ""
	I1217 20:33:16.466532  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.466539  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:16.466545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:16.466602  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:16.492200  420062 cri.go:89] found id: ""
	I1217 20:33:16.492213  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.492221  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:16.492226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:16.492302  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:16.517055  420062 cri.go:89] found id: ""
	I1217 20:33:16.517070  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.517083  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:16.517088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:16.517148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:16.552138  420062 cri.go:89] found id: ""
	I1217 20:33:16.552152  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.552159  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:16.552165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:16.552235  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:16.577184  420062 cri.go:89] found id: ""
	I1217 20:33:16.577198  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.577214  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:16.577220  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:16.577279  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:16.602039  420062 cri.go:89] found id: ""
	I1217 20:33:16.602053  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.602060  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:16.602066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:16.602124  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:16.626732  420062 cri.go:89] found id: ""
	I1217 20:33:16.626745  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.626752  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:16.626760  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:16.626770  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:16.689454  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:16.689473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:16.722345  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:16.722363  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.784686  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:16.784705  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:16.801895  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:16.801911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:16.865697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:19.365915  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:19.375998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:19.376066  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:19.399955  420062 cri.go:89] found id: ""
	I1217 20:33:19.399968  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.399976  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:19.399981  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:19.400039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:19.424668  420062 cri.go:89] found id: ""
	I1217 20:33:19.424682  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.424689  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:19.424695  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:19.424755  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:19.449865  420062 cri.go:89] found id: ""
	I1217 20:33:19.449879  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.449886  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:19.449891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:19.449958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:19.474803  420062 cri.go:89] found id: ""
	I1217 20:33:19.474816  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.474833  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:19.474838  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:19.474909  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:19.503551  420062 cri.go:89] found id: ""
	I1217 20:33:19.503579  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.503598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:19.503603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:19.503687  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:19.529232  420062 cri.go:89] found id: ""
	I1217 20:33:19.529246  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.529259  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:19.529264  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:19.529330  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:19.554443  420062 cri.go:89] found id: ""
	I1217 20:33:19.554456  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.554463  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:19.554481  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:19.554491  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:19.609391  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:19.609411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:19.625653  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:19.625669  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:19.691445  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:19.691456  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:19.691466  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:19.754663  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:19.754682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:22.297725  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:22.309139  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:22.309199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:22.334369  420062 cri.go:89] found id: ""
	I1217 20:33:22.334382  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.334390  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:22.334395  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:22.334458  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:22.363418  420062 cri.go:89] found id: ""
	I1217 20:33:22.363445  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.363453  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:22.363458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:22.363531  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:22.388924  420062 cri.go:89] found id: ""
	I1217 20:33:22.388939  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.388947  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:22.388993  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:22.389056  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:22.415757  420062 cri.go:89] found id: ""
	I1217 20:33:22.415780  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.415787  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:22.415793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:22.415872  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:22.441520  420062 cri.go:89] found id: ""
	I1217 20:33:22.441534  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.441541  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:22.441546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:22.441605  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:22.480775  420062 cri.go:89] found id: ""
	I1217 20:33:22.480789  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.480795  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:22.480801  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:22.480873  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:22.505556  420062 cri.go:89] found id: ""
	I1217 20:33:22.505570  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.505577  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:22.505585  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:22.505596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:22.562036  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:22.562054  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:22.577369  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:22.577386  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:22.647423  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:22.638838   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.639486   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641272   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641956   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.643602   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:22.638838   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.639486   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641272   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641956   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.643602   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:22.647453  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:22.647464  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:22.710153  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:22.710173  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.239783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:25.250945  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:25.251006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:25.277422  420062 cri.go:89] found id: ""
	I1217 20:33:25.277435  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.277443  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:25.277448  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:25.277510  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:25.303032  420062 cri.go:89] found id: ""
	I1217 20:33:25.303051  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.303063  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:25.303070  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:25.303176  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:25.333183  420062 cri.go:89] found id: ""
	I1217 20:33:25.333197  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.333204  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:25.333209  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:25.333272  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:25.358899  420062 cri.go:89] found id: ""
	I1217 20:33:25.358913  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.358920  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:25.358926  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:25.358986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:25.388611  420062 cri.go:89] found id: ""
	I1217 20:33:25.388625  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.388633  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:25.388638  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:25.388704  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:25.415829  420062 cri.go:89] found id: ""
	I1217 20:33:25.415844  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.415852  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:25.415857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:25.415913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:25.442921  420062 cri.go:89] found id: ""
	I1217 20:33:25.442935  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.442941  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:25.442949  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:25.442965  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:25.459113  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:25.459135  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:25.535629  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:25.526636   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.527172   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.528838   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.529443   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.530989   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:25.526636   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.527172   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.528838   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.529443   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.530989   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:25.535645  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:25.535655  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:25.601950  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:25.601968  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.634192  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:25.634208  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
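When the probe finds nothing, the loop collects a fixed diagnostic bundle: the kubelet and containerd journals, a severity-filtered dmesg, a crictl/docker container listing, and kubectl describe nodes against the node-local kubeconfig. The same bundle run by hand, with the commands taken from the log and only the minikube ssh wrapper assumed:

    minikube -p functional-682596 ssh -- sudo journalctl -u kubelet -n 400
    minikube -p functional-682596 ssh -- sudo journalctl -u containerd -n 400
    minikube -p functional-682596 ssh -- 'sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400'
    minikube -p functional-682596 ssh -- sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig

describe nodes is the only step that needs the apiserver, which is why it is the one step failing with connection refused in every cycle.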
	I1217 20:33:28.190569  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:28.200504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:28.200563  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:28.224311  420062 cri.go:89] found id: ""
	I1217 20:33:28.224325  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.224332  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:28.224338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:28.224396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:28.252603  420062 cri.go:89] found id: ""
	I1217 20:33:28.252622  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.252629  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:28.252634  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:28.252692  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:28.276684  420062 cri.go:89] found id: ""
	I1217 20:33:28.276697  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.276704  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:28.276709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:28.276777  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:28.299922  420062 cri.go:89] found id: ""
	I1217 20:33:28.299935  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.299942  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:28.299947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:28.300014  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:28.326124  420062 cri.go:89] found id: ""
	I1217 20:33:28.326137  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.326144  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:28.326150  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:28.326218  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:28.349497  420062 cri.go:89] found id: ""
	I1217 20:33:28.349510  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.349517  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:28.349523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:28.349579  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:28.378156  420062 cri.go:89] found id: ""
	I1217 20:33:28.378170  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.378177  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:28.378185  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:28.378194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.434254  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:28.434274  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:28.448810  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:28.448837  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:28.521268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:28.512905   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.513656   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515366   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515890   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.517404   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:28.512905   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.513656   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515366   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515890   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.517404   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:28.521279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:28.521290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:28.584201  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:28.584222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.112699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:31.123315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:31.123377  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:31.151761  420062 cri.go:89] found id: ""
	I1217 20:33:31.151776  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.151783  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:31.151789  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:31.151849  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:31.177165  420062 cri.go:89] found id: ""
	I1217 20:33:31.177178  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.177186  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:31.177191  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:31.177262  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:31.205229  420062 cri.go:89] found id: ""
	I1217 20:33:31.205260  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.205267  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:31.205272  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:31.205341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:31.229570  420062 cri.go:89] found id: ""
	I1217 20:33:31.229584  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.229591  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:31.229597  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:31.229673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:31.258880  420062 cri.go:89] found id: ""
	I1217 20:33:31.258904  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.258911  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:31.258917  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:31.258983  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:31.286222  420062 cri.go:89] found id: ""
	I1217 20:33:31.286241  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.286248  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:31.286253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:31.286315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:31.311291  420062 cri.go:89] found id: ""
	I1217 20:33:31.311314  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.311322  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:31.311330  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:31.311340  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.342524  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:31.342541  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:31.398421  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:31.398440  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:31.413476  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:31.413497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:31.478376  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:31.469734   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.470537   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472118   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472657   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.474358   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:31.469734   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.470537   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472118   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472657   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.474358   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:31.478388  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:31.478398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.044394  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:34.054571  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:34.054632  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:34.078791  420062 cri.go:89] found id: ""
	I1217 20:33:34.078815  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.078822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:34.078827  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:34.078902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:34.103484  420062 cri.go:89] found id: ""
	I1217 20:33:34.103498  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.103505  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:34.103510  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:34.103578  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:34.128330  420062 cri.go:89] found id: ""
	I1217 20:33:34.128343  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.128362  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:34.128368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:34.128436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:34.156115  420062 cri.go:89] found id: ""
	I1217 20:33:34.156129  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.156136  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:34.156141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:34.156208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:34.179862  420062 cri.go:89] found id: ""
	I1217 20:33:34.179876  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.179884  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:34.179889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:34.179959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:34.205717  420062 cri.go:89] found id: ""
	I1217 20:33:34.205731  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.205739  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:34.205745  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:34.205804  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:34.230674  420062 cri.go:89] found id: ""
	I1217 20:33:34.230689  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.230702  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:34.230710  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:34.230720  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:34.286930  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:34.286949  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:34.301786  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:34.301803  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:34.365439  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:34.365461  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:34.365473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.426703  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:34.426724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
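The timestamps show the loop retrying on a cadence of just under three seconds (20:33:31, :34, :37, ...). A sketch of an equivalent readiness poll to run inside the node, using the kubectl path and kubeconfig named in the log; the /readyz endpoint and the 3-second sleep are choices made for the sketch, not something this log confirms about minikube's internals:

    # wait until the apiserver on localhost:8441 answers its readiness probe
    until sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl \
        --kubeconfig=/var/lib/minikube/kubeconfig get --raw /readyz >/dev/null 2>&1; do
      echo 'apiserver not ready; retrying'
      sleep 3
    done

In this run such a poll would never terminate, since the apiserver never answers in the window shown here.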
	I1217 20:33:36.954941  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:36.964889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:36.964949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:37.000981  420062 cri.go:89] found id: ""
	I1217 20:33:37.000999  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.001008  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:37.001014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:37.001098  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:37.036987  420062 cri.go:89] found id: ""
	I1217 20:33:37.037001  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.037008  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:37.037013  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:37.037083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:37.067078  420062 cri.go:89] found id: ""
	I1217 20:33:37.067092  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.067099  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:37.067105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:37.067173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:37.101494  420062 cri.go:89] found id: ""
	I1217 20:33:37.101509  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.101516  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:37.101522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:37.101582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:37.125577  420062 cri.go:89] found id: ""
	I1217 20:33:37.125591  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.125599  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:37.125604  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:37.125672  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:37.155006  420062 cri.go:89] found id: ""
	I1217 20:33:37.155022  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.155040  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:37.155045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:37.155105  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:37.180061  420062 cri.go:89] found id: ""
	I1217 20:33:37.180075  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.180082  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:37.180090  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:37.180110  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:37.235716  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:37.235744  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:37.250676  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:37.250704  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:37.314789  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:37.314799  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:37.314811  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:37.376546  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:37.376566  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:39.904036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:39.914146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:39.914209  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:39.942353  420062 cri.go:89] found id: ""
	I1217 20:33:39.942366  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.942374  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:39.942379  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:39.942445  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:39.970090  420062 cri.go:89] found id: ""
	I1217 20:33:39.970105  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.970113  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:39.970119  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:39.970185  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:40.013204  420062 cri.go:89] found id: ""
	I1217 20:33:40.013220  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.013228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:40.013234  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:40.013312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:40.055438  420062 cri.go:89] found id: ""
	I1217 20:33:40.055453  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.055461  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:40.055467  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:40.055532  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:40.088240  420062 cri.go:89] found id: ""
	I1217 20:33:40.088285  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.088293  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:40.088298  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:40.088361  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:40.116666  420062 cri.go:89] found id: ""
	I1217 20:33:40.116680  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.116687  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:40.116693  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:40.116752  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:40.143935  420062 cri.go:89] found id: ""
	I1217 20:33:40.143951  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.143965  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:40.143973  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:40.143986  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:40.199464  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:40.199484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:40.214665  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:40.214682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:40.285603  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:40.285613  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:40.285623  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:40.348551  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:40.348571  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:42.882366  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:42.892346  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:42.892407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:42.917526  420062 cri.go:89] found id: ""
	I1217 20:33:42.917540  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.917548  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:42.917553  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:42.917622  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:42.941649  420062 cri.go:89] found id: ""
	I1217 20:33:42.941663  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.941670  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:42.941675  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:42.941737  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:42.965314  420062 cri.go:89] found id: ""
	I1217 20:33:42.965328  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.965335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:42.965341  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:42.965399  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:42.992861  420062 cri.go:89] found id: ""
	I1217 20:33:42.992875  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.992882  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:42.992888  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:42.992949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:43.026962  420062 cri.go:89] found id: ""
	I1217 20:33:43.026977  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.026984  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:43.026989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:43.027048  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:43.056268  420062 cri.go:89] found id: ""
	I1217 20:33:43.056282  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.056289  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:43.056295  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:43.056353  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:43.088527  420062 cri.go:89] found id: ""
	I1217 20:33:43.088542  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.088549  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:43.088556  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:43.088567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:43.115028  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:43.115044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:43.170239  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:43.170258  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:43.185453  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:43.185468  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:43.255155  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:43.255166  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:43.255176  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
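Across every cycle all seven control-plane queries (kube-apiserver, etcd, coredns, kube-scheduler, kube-proxy, kube-controller-manager, kindnet) return no containers even with crictl ps -a, which also lists exited containers, so the static pods were apparently never created rather than crashing after start. The kubelet journal gathered each cycle is the natural place to look for the reason. A sketch for pulling the relevant lines; the grep pattern is an assumption, everything else comes from the log:

    # stream the kubelet journal out of the node and filter it locally
    minikube -p functional-682596 ssh -- sudo journalctl -u kubelet -n 400 --no-pager \
        | grep -iE 'kube-apiserver|static pod|fail|error'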
	I1217 20:33:45.818750  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:45.829020  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:45.829084  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:45.854296  420062 cri.go:89] found id: ""
	I1217 20:33:45.854310  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.854319  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:45.854327  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:45.854393  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:45.884706  420062 cri.go:89] found id: ""
	I1217 20:33:45.884720  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.884728  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:45.884733  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:45.884795  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:45.909518  420062 cri.go:89] found id: ""
	I1217 20:33:45.909533  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.909540  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:45.909545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:45.909615  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:45.935050  420062 cri.go:89] found id: ""
	I1217 20:33:45.935065  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.935073  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:45.935078  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:45.935155  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:45.964622  420062 cri.go:89] found id: ""
	I1217 20:33:45.964636  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.964643  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:45.964648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:45.964714  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:45.992340  420062 cri.go:89] found id: ""
	I1217 20:33:45.992355  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.992363  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:45.992368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:45.992432  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:46.029800  420062 cri.go:89] found id: ""
	I1217 20:33:46.029815  420062 logs.go:282] 0 containers: []
	W1217 20:33:46.029822  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:46.029841  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:46.029852  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:46.096203  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:46.096224  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:46.111499  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:46.111517  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:46.174259  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:46.174269  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:46.174282  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:46.239891  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:46.239911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:48.769726  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:48.779731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:48.779796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:48.803697  420062 cri.go:89] found id: ""
	I1217 20:33:48.803710  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.803718  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:48.803723  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:48.803790  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:48.828947  420062 cri.go:89] found id: ""
	I1217 20:33:48.828966  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.828974  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:48.828979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:48.829045  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:48.853794  420062 cri.go:89] found id: ""
	I1217 20:33:48.853809  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.853815  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:48.853821  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:48.853884  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:48.879220  420062 cri.go:89] found id: ""
	I1217 20:33:48.879234  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.879241  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:48.879253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:48.879316  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:48.905546  420062 cri.go:89] found id: ""
	I1217 20:33:48.905560  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.905567  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:48.905573  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:48.905639  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:48.931025  420062 cri.go:89] found id: ""
	I1217 20:33:48.931040  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.931047  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:48.931053  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:48.931111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:48.959554  420062 cri.go:89] found id: ""
	I1217 20:33:48.959567  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.959575  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:48.959591  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:48.959603  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:49.037548  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:49.037558  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:49.037576  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:49.104606  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:49.104628  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:49.132120  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:49.132142  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:49.189781  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:49.189799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:51.705313  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:51.715310  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:51.715375  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:51.742788  420062 cri.go:89] found id: ""
	I1217 20:33:51.742803  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.742810  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:51.742816  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:51.742878  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:51.768132  420062 cri.go:89] found id: ""
	I1217 20:33:51.768147  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.768154  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:51.768160  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:51.768220  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:51.796803  420062 cri.go:89] found id: ""
	I1217 20:33:51.796817  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.796825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:51.796831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:51.796891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:51.823032  420062 cri.go:89] found id: ""
	I1217 20:33:51.823046  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.823054  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:51.823061  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:51.823122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:51.848750  420062 cri.go:89] found id: ""
	I1217 20:33:51.848765  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.848773  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:51.848778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:51.848840  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:51.874494  420062 cri.go:89] found id: ""
	I1217 20:33:51.874509  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.874516  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:51.874522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:51.874582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:51.912240  420062 cri.go:89] found id: ""
	I1217 20:33:51.912273  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.912281  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:51.912290  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:51.912301  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:51.940881  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:51.940897  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:51.997574  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:51.997596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:52.016000  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:52.016018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:52.093264  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:52.093274  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:52.093286  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:54.657449  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:54.667679  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:54.667741  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:54.696106  420062 cri.go:89] found id: ""
	I1217 20:33:54.696121  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.696128  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:54.696133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:54.696194  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:54.720578  420062 cri.go:89] found id: ""
	I1217 20:33:54.720592  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.720599  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:54.720605  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:54.720669  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:54.746036  420062 cri.go:89] found id: ""
	I1217 20:33:54.746050  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.746058  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:54.746063  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:54.746122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:54.770192  420062 cri.go:89] found id: ""
	I1217 20:33:54.770206  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.770213  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:54.770219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:54.770275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:54.794365  420062 cri.go:89] found id: ""
	I1217 20:33:54.794379  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.794386  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:54.794391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:54.794454  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:54.818424  420062 cri.go:89] found id: ""
	I1217 20:33:54.818438  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.818446  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:54.818451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:54.818513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:54.843360  420062 cri.go:89] found id: ""
	I1217 20:33:54.843375  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.843382  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:54.843401  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:54.843412  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:54.872684  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:54.872701  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:54.928831  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:54.928851  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:54.943545  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:54.943561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:55.020697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:55.020721  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:55.020734  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.590507  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:57.600840  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:57.600911  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:57.628650  420062 cri.go:89] found id: ""
	I1217 20:33:57.628664  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.628671  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:57.628676  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:57.628736  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:57.653915  420062 cri.go:89] found id: ""
	I1217 20:33:57.653929  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.653936  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:57.653941  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:57.654005  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:57.677881  420062 cri.go:89] found id: ""
	I1217 20:33:57.677894  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.677901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:57.677906  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:57.677974  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:57.701808  420062 cri.go:89] found id: ""
	I1217 20:33:57.701823  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.701830  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:57.701836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:57.701894  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:57.725682  420062 cri.go:89] found id: ""
	I1217 20:33:57.725696  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.725703  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:57.725708  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:57.725770  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:57.753864  420062 cri.go:89] found id: ""
	I1217 20:33:57.753878  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.753885  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:57.753891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:57.753948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:57.779180  420062 cri.go:89] found id: ""
	I1217 20:33:57.779193  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.779200  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:57.779216  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:57.779227  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:57.834554  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:57.834575  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:57.849468  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:57.849484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:57.917796  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:57.917816  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:57.917827  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.980535  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:57.980556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:00.519246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:00.531028  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:00.531090  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:00.557919  420062 cri.go:89] found id: ""
	I1217 20:34:00.557933  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.557941  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:00.557947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:00.558006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:00.583357  420062 cri.go:89] found id: ""
	I1217 20:34:00.583381  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.583389  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:00.583394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:00.583461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:00.608300  420062 cri.go:89] found id: ""
	I1217 20:34:00.608313  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.608321  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:00.608326  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:00.608396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:00.633249  420062 cri.go:89] found id: ""
	I1217 20:34:00.633263  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.633271  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:00.633277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:00.633354  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:00.657998  420062 cri.go:89] found id: ""
	I1217 20:34:00.658012  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.658020  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:00.658025  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:00.658083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:00.686479  420062 cri.go:89] found id: ""
	I1217 20:34:00.686494  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.686502  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:00.686517  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:00.686600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:00.715237  420062 cri.go:89] found id: ""
	I1217 20:34:00.715251  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.715259  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:00.715281  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:00.715297  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:00.771736  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:00.771756  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:00.786569  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:00.786584  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:00.855532  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:00.846820   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.847617   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849290   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849821   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.851435   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:00.855544  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:00.855556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:00.929889  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:00.929917  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.457778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:03.467767  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:03.467830  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:03.491745  420062 cri.go:89] found id: ""
	I1217 20:34:03.491760  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.491767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:03.491772  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:03.491834  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:03.516486  420062 cri.go:89] found id: ""
	I1217 20:34:03.516501  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.516508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:03.516514  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:03.516573  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:03.545504  420062 cri.go:89] found id: ""
	I1217 20:34:03.545518  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.545526  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:03.545531  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:03.545592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:03.570752  420062 cri.go:89] found id: ""
	I1217 20:34:03.570766  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.570773  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:03.570779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:03.570837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:03.599464  420062 cri.go:89] found id: ""
	I1217 20:34:03.599478  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.599486  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:03.599491  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:03.599551  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:03.626193  420062 cri.go:89] found id: ""
	I1217 20:34:03.626209  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.626217  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:03.626222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:03.626280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:03.650682  420062 cri.go:89] found id: ""
	I1217 20:34:03.650696  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.650704  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:03.650712  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:03.650724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:03.712614  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:03.705244   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.705869   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.706805   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.707331   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.708827   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:03.712625  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:03.712636  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:03.775226  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:03.775247  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.801581  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:03.801600  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:03.857991  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:03.858013  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.373018  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:06.382912  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:06.382972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:06.408596  420062 cri.go:89] found id: ""
	I1217 20:34:06.408610  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.408617  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:06.408622  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:06.408681  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:06.437062  420062 cri.go:89] found id: ""
	I1217 20:34:06.437076  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.437083  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:06.437088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:06.437149  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:06.463109  420062 cri.go:89] found id: ""
	I1217 20:34:06.463123  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.463130  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:06.463135  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:06.463198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:06.487450  420062 cri.go:89] found id: ""
	I1217 20:34:06.487463  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.487470  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:06.487476  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:06.487537  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:06.512848  420062 cri.go:89] found id: ""
	I1217 20:34:06.512863  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.512870  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:06.512876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:06.512939  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:06.536984  420062 cri.go:89] found id: ""
	I1217 20:34:06.536998  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.537006  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:06.537011  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:06.537069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:06.565689  420062 cri.go:89] found id: ""
	I1217 20:34:06.565732  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.565740  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:06.565748  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:06.565758  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:06.626274  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:06.626294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.641612  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:06.641630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:06.703082  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:06.694717   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.695365   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697091   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697739   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.699357   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:06.703092  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:06.703104  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:06.768202  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:06.768221  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:09.296397  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:09.306558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:09.306619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:09.330814  420062 cri.go:89] found id: ""
	I1217 20:34:09.330828  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.330836  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:09.330841  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:09.330900  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:09.360228  420062 cri.go:89] found id: ""
	I1217 20:34:09.360242  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.360270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:09.360276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:09.360336  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:09.383852  420062 cri.go:89] found id: ""
	I1217 20:34:09.383865  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.383871  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:09.383876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:09.383933  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:09.408740  420062 cri.go:89] found id: ""
	I1217 20:34:09.408753  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.408760  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:09.408765  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:09.408824  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:09.433879  420062 cri.go:89] found id: ""
	I1217 20:34:09.433894  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.433901  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:09.433907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:09.433965  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:09.458138  420062 cri.go:89] found id: ""
	I1217 20:34:09.458152  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.458160  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:09.458165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:09.458223  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:09.482170  420062 cri.go:89] found id: ""
	I1217 20:34:09.482184  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.482191  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:09.482199  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:09.482214  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:09.539809  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:09.539831  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:09.555108  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:09.555124  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:09.617755  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
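The "describe nodes" gather shells out to the version-matched kubectl that minikube installed on the node (/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl) against the cluster's own kubeconfig. The warning embeds the command's stderr in its error string, and minikube then prints the captured stderr block again, which is why each failure appears twice in this log. A rough local equivalent of the invocation, assuming the same paths as above (run directly rather than through ssh_runner):

    package main

    import (
        "bytes"
        "fmt"
        "os/exec"
    )

    func main() {
        // The binary path and kubeconfig are the ones the log shows
        // minikube using; this is illustrative, not minikube's code.
        cmd := exec.Command("sudo",
            "/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
            "describe", "nodes",
            "--kubeconfig=/var/lib/minikube/kubeconfig")
        var stderr bytes.Buffer
        cmd.Stderr = &stderr
        if err := cmd.Run(); err != nil {
            // minikube reports the exit error and the captured stderr
            // separately, hence the doubled block in the log above.
            fmt.Println(err)
            fmt.Println(stderr.String())
        }
    }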
	I1217 20:34:09.617779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:09.617790  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:09.680900  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:09.680920  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
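Each pass sweeps the control-plane components by name with `sudo crictl ps -a --quiet --name=<component>` against containerd's runc state root for the k8s.io namespace (/run/containerd/runc/k8s.io); `--quiet` prints only container IDs, so an empty result is what logs.go records as `found id: ""` and `0 containers`. An illustrative local version of the same sweep (not minikube's actual code, and without the SSH hop):

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    func main() {
        components := []string{
            "kube-apiserver", "etcd", "coredns", "kube-scheduler",
            "kube-proxy", "kube-controller-manager", "kindnet",
        }
        for _, name := range components {
            // --quiet prints one container ID per line; no output means
            // no container (running or exited) matches the name.
            out, _ := exec.Command("sudo", "crictl", "ps", "-a",
                "--quiet", "--name="+name).Output()
            if ids := strings.Fields(string(out)); len(ids) == 0 {
                fmt.Printf("no container was found matching %q\n", name)
            }
        }
    }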
	[... the same log-gathering pass repeats seven more times between 20:34:12 and 20:34:30, roughly every three seconds, with identical results: no kube-apiserver, etcd, coredns, kube-scheduler, kube-proxy, kube-controller-manager or kindnet containers found, and every "describe nodes" attempt refused on localhost:8441 ...]
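The remaining gathers are plain systemd and kernel queries: `journalctl -u kubelet -n 400` and `journalctl -u containerd -n 400` tail the last 400 journal lines for each unit, while the dmesg invocation prints the kernel ring buffer human-readably (-H) with no pager (-P) and no color (-L=never), keeps only records of warning severity or worse (--level warn,err,crit,alert,emerg), and tails 400 lines. The `pgrep -xnf` gate at the top of each pass matches the pattern against the full command line (-f), requires a whole-line match (-x), and reports only the newest match (-n); its empty result is what keeps this loop running.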
	I1217 20:34:32.873852  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:32.884009  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:32.884072  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:32.908673  420062 cri.go:89] found id: ""
	I1217 20:34:32.908688  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.908696  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:32.908701  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:32.908761  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:32.933101  420062 cri.go:89] found id: ""
	I1217 20:34:32.933115  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.933122  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:32.933127  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:32.933192  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:32.956968  420062 cri.go:89] found id: ""
	I1217 20:34:32.956982  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.956991  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:32.956996  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:32.957054  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:32.982228  420062 cri.go:89] found id: ""
	I1217 20:34:32.982241  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.982249  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:32.982254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:32.982312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:33.011791  420062 cri.go:89] found id: ""
	I1217 20:34:33.011805  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.011812  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:33.011818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:33.011885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:33.038878  420062 cri.go:89] found id: ""
	I1217 20:34:33.038894  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.038901  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:33.038907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:33.038969  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:33.068421  420062 cri.go:89] found id: ""
	I1217 20:34:33.068436  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.068443  420062 logs.go:284] No container was found matching "kindnet"
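
The sweep above is minikube's control-plane discovery pass: for each expected component it asks the CRI for any container, running or exited, whose name matches, and records a warning when nothing is found. A minimal shell sketch of the same sweep, with the component list and the crictl invocation taken verbatim from the log lines above and only the loop wrapper assumed:

    # Loop wrapper assumed; component names and crictl flags are verbatim from the log.
    for c in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$c")   # -a: all states, --quiet: print IDs only
      [ -z "$ids" ] && echo "No container was found matching \"$c\""
    done

Here every component comes back empty, which is consistent with the connection-refused errors: no control-plane container ever reached a running state.
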
	I1217 20:34:33.068453  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:33.068463  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:33.083444  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:33.083461  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:33.147593  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
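
Each "describe nodes" attempt fails identically: the bundled kubectl reads the kubeconfig, but nothing is listening on the apiserver port, so client-go retries the API group discovery five times and then gives up with "connection refused". The failing command below is verbatim from the log; the port probe after it is a hypothetical follow-up one could run on the node to confirm the refusal, not something the harness executed:

    # Verbatim from the log: describe nodes through the bundled kubectl.
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes \
      --kubeconfig=/var/lib/minikube/kubeconfig
    # Hypothetical follow-up (not in the log): check for a listener on 8441.
    sudo ss -tlnp | grep 8441 || echo "nothing listening on 8441"
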
	I1217 20:34:33.147604  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:33.147617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:33.211005  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:33.211025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:33.247311  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:33.247327  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:35.820692  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:35.830805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:35.830879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:35.855694  420062 cri.go:89] found id: ""
	I1217 20:34:35.855708  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.855716  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:35.855721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:35.855780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:35.879932  420062 cri.go:89] found id: ""
	I1217 20:34:35.879947  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.879955  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:35.879960  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:35.880021  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:35.904606  420062 cri.go:89] found id: ""
	I1217 20:34:35.904622  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.904630  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:35.904635  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:35.904700  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:35.932655  420062 cri.go:89] found id: ""
	I1217 20:34:35.932669  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.932676  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:35.932681  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:35.932742  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:35.956665  420062 cri.go:89] found id: ""
	I1217 20:34:35.956679  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.956686  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:35.956691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:35.956748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:35.981363  420062 cri.go:89] found id: ""
	I1217 20:34:35.981377  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.981385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:35.981391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:35.981450  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:36.013052  420062 cri.go:89] found id: ""
	I1217 20:34:36.013068  420062 logs.go:282] 0 containers: []
	W1217 20:34:36.013076  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:36.013084  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:36.013097  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:36.080346  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:36.080367  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:36.109280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:36.109296  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:36.168612  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:36.168630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:36.183490  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:36.183505  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:36.254206  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
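
After each failed discovery pass, the harness gathers the same four diagnostics: kubelet and containerd journals, kernel messages at warning level or worse, and a container-status listing that falls back to docker if crictl is absent. All four commands below are copied verbatim from the log; only their grouping into one sequence is assumed:

    sudo journalctl -u kubelet -n 400        # last 400 kubelet journal lines
    sudo journalctl -u containerd -n 400     # last 400 containerd journal lines
    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400   # kernel warnings and worse
    sudo `which crictl || echo crictl` ps -a || sudo docker ps -a             # container status, docker fallback
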
	I1217 20:34:38.754461  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:38.764820  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:38.764885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:38.790226  420062 cri.go:89] found id: ""
	I1217 20:34:38.790243  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.790251  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:38.790257  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:38.790317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:38.815898  420062 cri.go:89] found id: ""
	I1217 20:34:38.815913  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.815920  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:38.815925  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:38.815986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:38.840879  420062 cri.go:89] found id: ""
	I1217 20:34:38.840894  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.840901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:38.840907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:38.840967  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:38.865756  420062 cri.go:89] found id: ""
	I1217 20:34:38.865772  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.865780  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:38.865785  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:38.865851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:38.893497  420062 cri.go:89] found id: ""
	I1217 20:34:38.893511  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.893518  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:38.893523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:38.893582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:38.918737  420062 cri.go:89] found id: ""
	I1217 20:34:38.918751  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.918758  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:38.918763  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:38.918821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:38.943126  420062 cri.go:89] found id: ""
	I1217 20:34:38.943140  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.943147  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:38.943155  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:38.943166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:39.008933  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:39.008944  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:39.008955  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:39.071529  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:39.071550  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:39.098851  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:39.098866  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:39.157559  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:39.157578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.673292  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:41.683569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:41.683631  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:41.712444  420062 cri.go:89] found id: ""
	I1217 20:34:41.712458  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.712466  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:41.712471  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:41.712540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:41.737230  420062 cri.go:89] found id: ""
	I1217 20:34:41.737244  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.737253  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:41.737258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:41.737320  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:41.765904  420062 cri.go:89] found id: ""
	I1217 20:34:41.765918  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.765926  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:41.765931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:41.765993  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:41.790803  420062 cri.go:89] found id: ""
	I1217 20:34:41.790818  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.790826  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:41.790831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:41.790891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:41.816378  420062 cri.go:89] found id: ""
	I1217 20:34:41.816393  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.816399  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:41.816405  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:41.816465  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:41.846163  420062 cri.go:89] found id: ""
	I1217 20:34:41.846177  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.846184  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:41.846190  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:41.846249  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:41.874235  420062 cri.go:89] found id: ""
	I1217 20:34:41.874249  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.874257  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:41.874264  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:41.874278  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:41.930007  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:41.930025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.944733  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:41.944748  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:42.015145  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:42.015157  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:42.015168  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:42.083018  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:42.083046  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.617783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:44.627898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:44.627959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:44.654510  420062 cri.go:89] found id: ""
	I1217 20:34:44.654524  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.654531  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:44.654536  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:44.654600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:44.681532  420062 cri.go:89] found id: ""
	I1217 20:34:44.681547  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.681554  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:44.681560  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:44.681620  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:44.705927  420062 cri.go:89] found id: ""
	I1217 20:34:44.705941  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.705948  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:44.705953  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:44.706010  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:44.730835  420062 cri.go:89] found id: ""
	I1217 20:34:44.730849  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.730857  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:44.730862  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:44.730925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:44.754987  420062 cri.go:89] found id: ""
	I1217 20:34:44.755002  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.755009  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:44.755014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:44.755074  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:44.778787  420062 cri.go:89] found id: ""
	I1217 20:34:44.778801  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.778808  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:44.778814  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:44.778874  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:44.804370  420062 cri.go:89] found id: ""
	I1217 20:34:44.804385  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.804392  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:44.804401  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:44.804411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:44.870852  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:44.870872  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.901529  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:44.901545  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:44.961405  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:44.961428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:44.976411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:44.976427  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:45.055180  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:47.555437  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:47.565320  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:47.565380  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:47.594473  420062 cri.go:89] found id: ""
	I1217 20:34:47.594488  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.594495  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:47.594500  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:47.594560  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:47.618819  420062 cri.go:89] found id: ""
	I1217 20:34:47.618833  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.618840  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:47.618845  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:47.618906  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:47.643299  420062 cri.go:89] found id: ""
	I1217 20:34:47.643313  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.643320  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:47.643325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:47.643386  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:47.668500  420062 cri.go:89] found id: ""
	I1217 20:34:47.668514  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.668522  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:47.668527  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:47.668588  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:47.694650  420062 cri.go:89] found id: ""
	I1217 20:34:47.694671  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.694678  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:47.694683  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:47.694745  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:47.729169  420062 cri.go:89] found id: ""
	I1217 20:34:47.729183  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.729192  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:47.729197  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:47.729258  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:47.753481  420062 cri.go:89] found id: ""
	I1217 20:34:47.753494  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.753501  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:47.753509  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:47.753521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:47.768175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:47.768192  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:47.832224  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:47.832234  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:47.832264  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:47.894275  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:47.894294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:47.921621  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:47.921638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.477347  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:50.487837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:50.487905  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:50.515440  420062 cri.go:89] found id: ""
	I1217 20:34:50.515460  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.515468  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:50.515473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:50.515545  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:50.542521  420062 cri.go:89] found id: ""
	I1217 20:34:50.542546  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.542553  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:50.542559  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:50.542629  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:50.569586  420062 cri.go:89] found id: ""
	I1217 20:34:50.569600  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.569613  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:50.569618  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:50.569677  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:50.597938  420062 cri.go:89] found id: ""
	I1217 20:34:50.597951  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.597958  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:50.597966  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:50.598024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:50.627019  420062 cri.go:89] found id: ""
	I1217 20:34:50.627044  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.627052  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:50.627057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:50.627128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:50.655921  420062 cri.go:89] found id: ""
	I1217 20:34:50.655948  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.655956  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:50.655962  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:50.656028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:50.680457  420062 cri.go:89] found id: ""
	I1217 20:34:50.680471  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.680479  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:50.680487  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:50.680502  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:50.742350  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
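
The timestamps show the whole cycle repeating on a roughly three-second cadence: a pgrep for a running apiserver process, the seven-container sweep, then the log gathering. A sketch of the outer wait loop, with the pgrep pattern verbatim from the log and the retry cadence inferred from the timestamps (20:34:30, :32, :35, :38, ...), not confirmed by the source:

    # Outer wait loop assumed; the pgrep pattern is verbatim from the log.
    until sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; do
      sleep 3   # cadence inferred from the log timestamps; not confirmed
    done

The loop never exits in this run: pgrep keeps finding no apiserver process, so the diagnostics repeat until the test's timeout expires.
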
	I1217 20:34:50.742360  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:50.742370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:50.802977  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:50.802997  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:50.830354  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:50.830370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.887850  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:50.887869  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.403065  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:53.413162  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:53.413227  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:53.437500  420062 cri.go:89] found id: ""
	I1217 20:34:53.437513  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.437521  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:53.437526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:53.437592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:53.462889  420062 cri.go:89] found id: ""
	I1217 20:34:53.462902  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.462910  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:53.462915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:53.462972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:53.493212  420062 cri.go:89] found id: ""
	I1217 20:34:53.493226  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.493234  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:53.493239  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:53.493301  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:53.521829  420062 cri.go:89] found id: ""
	I1217 20:34:53.521844  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.521851  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:53.521857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:53.521919  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:53.558427  420062 cri.go:89] found id: ""
	I1217 20:34:53.558442  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.558449  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:53.558454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:53.558513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:53.583439  420062 cri.go:89] found id: ""
	I1217 20:34:53.583453  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.583460  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:53.583466  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:53.583526  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:53.608693  420062 cri.go:89] found id: ""
	I1217 20:34:53.608707  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.608714  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:53.608722  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:53.608732  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:53.664959  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:53.664980  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.679865  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:53.679886  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:53.742568  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:53.742579  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:53.742591  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:53.803297  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:53.803317  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.335304  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:56.344915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:56.344977  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:56.368289  420062 cri.go:89] found id: ""
	I1217 20:34:56.368304  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.368312  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:56.368319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:56.368388  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:56.392693  420062 cri.go:89] found id: ""
	I1217 20:34:56.392707  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.392715  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:56.392721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:56.392782  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:56.419795  420062 cri.go:89] found id: ""
	I1217 20:34:56.419809  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.419825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:56.419834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:56.419902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:56.445038  420062 cri.go:89] found id: ""
	I1217 20:34:56.445052  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.445060  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:56.445065  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:56.445128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:56.474272  420062 cri.go:89] found id: ""
	I1217 20:34:56.474287  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.474294  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:56.474300  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:56.474366  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:56.507935  420062 cri.go:89] found id: ""
	I1217 20:34:56.507950  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.507957  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:56.507963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:56.508030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:56.535999  420062 cri.go:89] found id: ""
	I1217 20:34:56.536012  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.536030  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:56.536039  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:56.536050  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.572020  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:56.572037  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:56.628661  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:56.628681  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:56.643833  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:56.643856  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:56.710351  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:56.710361  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:56.710380  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
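The cycle above is minikube's health probe: for each control-plane component it lists CRI containers with "sudo crictl ps -a --quiet --name=<component>" and treats empty output as "component not running", which is why every probe ends in a No-container-was-found warning while the apiserver is down. A minimal sketch of that probe pattern, assuming passwordless sudo, crictl on PATH, and a local shell standing in for minikube's SSH runner:

	// probe.go: list CRI containers per component and report the ones missing.
	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	// containerIDs runs the same command the log shows; --quiet prints one
	// container ID per line, -a includes exited containers.
	func containerIDs(component string) ([]string, error) {
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+component).Output()
		if err != nil {
			return nil, fmt.Errorf("crictl ps for %q: %w", component, err)
		}
		return strings.Fields(string(out)), nil
	}

	func main() {
		for _, c := range []string{
			"kube-apiserver", "etcd", "coredns", "kube-scheduler",
			"kube-proxy", "kube-controller-manager", "kindnet",
		} {
			ids, err := containerIDs(c)
			switch {
			case err != nil:
				fmt.Printf("probe %s: %v\n", c, err)
			case len(ids) == 0:
				// Mirrors the repeated `No container was found matching "<name>"` lines.
				fmt.Printf("no container found matching %q\n", c)
			default:
				fmt.Printf("%s: %d container(s)\n", c, len(ids))
			}
		}
	}

An empty ID list, not a non-zero exit, is what drives the warning: crictl exits 0 whether or not anything matched, which is consistent with the `found id: ""` lines above.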
	I1217 20:34:59.273579  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:59.283581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:59.283645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:59.309480  420062 cri.go:89] found id: ""
	I1217 20:34:59.309493  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.309500  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:59.309506  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:59.309564  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:59.333365  420062 cri.go:89] found id: ""
	I1217 20:34:59.333378  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.333386  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:59.333391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:59.333452  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:59.357207  420062 cri.go:89] found id: ""
	I1217 20:34:59.357221  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.357228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:59.357233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:59.357298  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:59.381758  420062 cri.go:89] found id: ""
	I1217 20:34:59.381772  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.381781  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:59.381787  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:59.381845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:59.406750  420062 cri.go:89] found id: ""
	I1217 20:34:59.406764  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.406772  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:59.406777  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:59.406845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:59.431825  420062 cri.go:89] found id: ""
	I1217 20:34:59.431838  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.431846  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:59.431852  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:59.431913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:59.458993  420062 cri.go:89] found id: ""
	I1217 20:34:59.459007  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.459014  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:59.459022  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:59.459041  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:59.546381  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:59.546391  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:59.546401  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.613987  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:59.614007  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:59.644296  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:59.644311  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:59.703226  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:59.703245  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
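The timestamps (20:34:56, 20:34:59, 20:35:02, ...) show the whole gather-and-probe cycle repeating roughly every three seconds: a poll-until-deadline loop around the "sudo pgrep -xnf kube-apiserver.*minikube.*" check. A standard-library sketch of that shape; the three-second interval matches the log, but the deadline here is an illustrative assumption, not minikube's actual setting:

	// poll.go: retry a condition at a fixed interval until a deadline passes.
	package main

	import (
		"errors"
		"fmt"
		"os/exec"
		"time"
	)

	func pollUntil(interval, timeout time.Duration, check func() bool) error {
		deadline := time.Now().Add(timeout)
		for {
			if check() {
				return nil
			}
			if time.Now().After(deadline) {
				return errors.New("condition not met before deadline")
			}
			time.Sleep(interval)
		}
	}

	func main() {
		// pgrep exits 0 only when a matching process exists, so Run() == nil
		// doubles as the "apiserver process is up" signal.
		apiserverRunning := func() bool {
			return exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil
		}
		if err := pollUntil(3*time.Second, 2*time.Minute, apiserverRunning); err != nil {
			fmt.Println(err) // the test eventually gives up the same way
		}
	}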
	I1217 20:35:02.218783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:02.229042  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:02.229114  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:02.254286  420062 cri.go:89] found id: ""
	I1217 20:35:02.254300  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.254308  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:02.254315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:02.254374  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:02.281092  420062 cri.go:89] found id: ""
	I1217 20:35:02.281106  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.281114  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:02.281120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:02.281198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:02.310195  420062 cri.go:89] found id: ""
	I1217 20:35:02.310209  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.310217  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:02.310222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:02.310294  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:02.338807  420062 cri.go:89] found id: ""
	I1217 20:35:02.338821  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.338829  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:02.338834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:02.338904  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:02.364604  420062 cri.go:89] found id: ""
	I1217 20:35:02.364618  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.364625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:02.364631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:02.364693  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:02.389458  420062 cri.go:89] found id: ""
	I1217 20:35:02.389473  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.389481  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:02.389486  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:02.389544  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:02.419120  420062 cri.go:89] found id: ""
	I1217 20:35:02.419134  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.419142  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:02.419151  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:02.419162  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:02.476620  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:02.476640  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.492411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:02.492428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:02.567285  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:02.567294  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:02.567308  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:02.635002  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:02.635022  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.163567  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:05.174184  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:05.174245  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:05.199116  420062 cri.go:89] found id: ""
	I1217 20:35:05.199130  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.199137  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:05.199143  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:05.199206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:05.223477  420062 cri.go:89] found id: ""
	I1217 20:35:05.223491  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.223498  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:05.223504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:05.223562  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:05.247303  420062 cri.go:89] found id: ""
	I1217 20:35:05.247317  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.247325  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:05.247332  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:05.247391  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:05.272620  420062 cri.go:89] found id: ""
	I1217 20:35:05.272633  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.272641  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:05.272646  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:05.272703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:05.300419  420062 cri.go:89] found id: ""
	I1217 20:35:05.300434  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.300441  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:05.300446  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:05.300505  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:05.325851  420062 cri.go:89] found id: ""
	I1217 20:35:05.325866  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.325873  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:05.325879  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:05.325938  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:05.354430  420062 cri.go:89] found id: ""
	I1217 20:35:05.354445  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.354452  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:05.354460  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:05.354475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:05.369668  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:05.369686  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:05.436390  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:05.436400  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:05.436411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:05.499177  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:05.499202  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.531231  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:05.531248  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.088375  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:08.098640  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:08.098711  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:08.132112  420062 cri.go:89] found id: ""
	I1217 20:35:08.132127  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:08.132141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:08.132205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:08.157778  420062 cri.go:89] found id: ""
	I1217 20:35:08.157792  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.157800  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:08.157805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:08.157862  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:08.183372  420062 cri.go:89] found id: ""
	I1217 20:35:08.183386  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.183393  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:08.183399  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:08.183457  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:08.208186  420062 cri.go:89] found id: ""
	I1217 20:35:08.208200  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.208207  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:08.208212  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:08.208310  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:08.236181  420062 cri.go:89] found id: ""
	I1217 20:35:08.236195  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.236202  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:08.236207  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:08.236313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:08.261508  420062 cri.go:89] found id: ""
	I1217 20:35:08.261522  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.261529  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:08.261534  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:08.261593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:08.286303  420062 cri.go:89] found id: ""
	I1217 20:35:08.286318  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.286325  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:08.286333  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:08.286349  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.345547  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:08.345573  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:08.360551  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:08.360568  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:08.424581  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:08.424593  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:08.424606  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:08.489146  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:08.489166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
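Each "failed describe nodes" warning splits the command's result into an empty stdout and a stderr full of client errors, which is what you get when the two streams are captured separately rather than merged. A sketch of that capture using os/exec, with the kubectl binary and kubeconfig paths copied from the log; this is illustrative, not minikube's ssh_runner:

	// describe.go: run kubectl and keep stdout and stderr apart, as the log does.
	package main

	import (
		"bytes"
		"fmt"
		"os/exec"
	)

	func main() {
		cmd := exec.Command("sudo",
			"/var/lib/minikube/binaries/v1.35.0-rc.1/kubectl",
			"describe", "nodes",
			"--kubeconfig=/var/lib/minikube/kubeconfig")
		var stdout, stderr bytes.Buffer
		cmd.Stdout = &stdout
		cmd.Stderr = &stderr
		err := cmd.Run()
		fmt.Printf("stdout:\n%s\n", stdout.String()) // empty above: nothing was described
		fmt.Printf("stderr:\n%s\n", stderr.String()) // the "connection refused" lines
		if err != nil {
			fmt.Printf("command failed: %v\n", err) // "Process exited with status 1"
		}
	}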
	I1217 20:35:11.022570  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:11.034138  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:11.034205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:11.066795  420062 cri.go:89] found id: ""
	I1217 20:35:11.066810  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.066817  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:11.066825  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:11.066888  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:11.092902  420062 cri.go:89] found id: ""
	I1217 20:35:11.092917  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.092925  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:11.092931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:11.092998  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:11.120040  420062 cri.go:89] found id: ""
	I1217 20:35:11.120056  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.120064  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:11.120069  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:11.120138  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:11.150096  420062 cri.go:89] found id: ""
	I1217 20:35:11.150111  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.150118  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:11.150124  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:11.150186  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:11.178952  420062 cri.go:89] found id: ""
	I1217 20:35:11.178966  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.178973  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:11.178979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:11.179042  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:11.205194  420062 cri.go:89] found id: ""
	I1217 20:35:11.205208  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.205215  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:11.205221  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:11.205281  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:11.231314  420062 cri.go:89] found id: ""
	I1217 20:35:11.231327  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.231335  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:11.231343  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:11.231355  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:11.246458  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:11.246475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:11.312684  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:11.312696  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:11.312706  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:11.379354  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:11.379374  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.413484  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:11.413500  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:13.972078  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:13.982223  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:13.982290  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:14.022488  420062 cri.go:89] found id: ""
	I1217 20:35:14.022502  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.022510  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:14.022515  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:14.022575  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:14.059328  420062 cri.go:89] found id: ""
	I1217 20:35:14.059342  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.059364  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:14.059369  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:14.059435  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:14.085531  420062 cri.go:89] found id: ""
	I1217 20:35:14.085544  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.085552  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:14.085558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:14.085616  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:14.114113  420062 cri.go:89] found id: ""
	I1217 20:35:14.114134  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.114141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:14.114147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:14.114210  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:14.138505  420062 cri.go:89] found id: ""
	I1217 20:35:14.138519  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.138526  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:14.138532  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:14.138591  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:14.162838  420062 cri.go:89] found id: ""
	I1217 20:35:14.162852  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.162858  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:14.162863  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:14.162923  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:14.190631  420062 cri.go:89] found id: ""
	I1217 20:35:14.190651  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.190665  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:14.190672  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:14.190682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:14.246544  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:14.246563  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:14.261703  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:14.261719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:14.327698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:14.327708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:14.327721  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:14.391616  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:14.391635  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:16.921553  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:16.931542  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:16.931604  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:16.955206  420062 cri.go:89] found id: ""
	I1217 20:35:16.955220  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.955227  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:16.955233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:16.955291  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:16.984598  420062 cri.go:89] found id: ""
	I1217 20:35:16.984613  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.984620  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:16.984625  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:16.984683  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:17.033712  420062 cri.go:89] found id: ""
	I1217 20:35:17.033726  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.033733  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:17.033739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:17.033796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:17.061936  420062 cri.go:89] found id: ""
	I1217 20:35:17.061950  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.061957  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:17.061963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:17.062023  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:17.086921  420062 cri.go:89] found id: ""
	I1217 20:35:17.086936  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.086943  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:17.086948  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:17.087009  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:17.112474  420062 cri.go:89] found id: ""
	I1217 20:35:17.112488  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.112495  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:17.112501  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:17.112558  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:17.137847  420062 cri.go:89] found id: ""
	I1217 20:35:17.137867  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.137875  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:17.137882  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:17.137892  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:17.198885  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:17.198904  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:17.213637  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:17.213652  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:17.281467  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:35:17.281478  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:17.281488  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:17.343313  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:17.343334  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:19.871984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:19.882066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:19.882128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:19.907664  420062 cri.go:89] found id: ""
	I1217 20:35:19.907678  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.907686  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:19.907691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:19.907750  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:19.936014  420062 cri.go:89] found id: ""
	I1217 20:35:19.936028  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.936035  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:19.936040  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:19.936099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:19.961865  420062 cri.go:89] found id: ""
	I1217 20:35:19.961881  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.961888  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:19.961893  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:19.961954  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:19.988749  420062 cri.go:89] found id: ""
	I1217 20:35:19.988762  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.988769  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:19.988775  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:19.988832  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:20.021844  420062 cri.go:89] found id: ""
	I1217 20:35:20.021859  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.021866  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:20.021873  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:20.021936  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:20.064328  420062 cri.go:89] found id: ""
	I1217 20:35:20.064343  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.064351  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:20.064356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:20.064464  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:20.092230  420062 cri.go:89] found id: ""
	I1217 20:35:20.092244  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.092272  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:20.092280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:20.092291  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:20.150597  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:20.150617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:20.166734  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:20.166751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:20.235344  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:20.235354  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:20.235368  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:20.300971  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:20.300991  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
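	[editor's note] The cycle above is minikube's control-plane probe: for each expected component it asks the CRI runtime for matching container IDs (sudo crictl ps -a --quiet --name=<component>), finds none, and falls through to gathering kubelet/dmesg/containerd logs; the final container-status step even falls back to docker ps -a when crictl is missing. A minimal Go sketch of that per-component probe follows, assuming sudo and crictl are on PATH; it is illustrative only, not minikube's actual implementation.

	// Illustrative sketch of the per-component CRI probe seen in the log
	// above (not minikube's actual code). For each control-plane component,
	// ask crictl for matching container IDs and report whether any exist.
	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	func main() {
		// Component names mirror the ones probed in the log.
		components := []string{
			"kube-apiserver", "etcd", "coredns", "kube-scheduler",
			"kube-proxy", "kube-controller-manager", "kindnet",
		}
		for _, name := range components {
			// Equivalent of: sudo crictl ps -a --quiet --name=<component>
			out, err := exec.Command("sudo", "crictl", "ps", "-a",
				"--quiet", "--name="+name).Output()
			if err != nil {
				fmt.Printf("crictl failed for %q: %v\n", name, err)
				continue
			}
			ids := strings.Fields(string(out))
			if len(ids) == 0 {
				// Matches the log's: No container was found matching "<name>"
				fmt.Printf("no container found matching %q\n", name)
				continue
			}
			fmt.Printf("%q: %d container(s): %v\n", name, len(ids), ids)
		}
	}

	On a node in the state logged above, this would print "no container found matching ..." for every component, which is exactly what logs.go:284 keeps reporting.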
	I1217 20:35:22.830503  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:22.840565  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:22.840627  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:22.865965  420062 cri.go:89] found id: ""
	I1217 20:35:22.865980  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.865987  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:22.865992  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:22.866051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:22.890981  420062 cri.go:89] found id: ""
	I1217 20:35:22.890995  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.891002  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:22.891007  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:22.891067  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:22.916050  420062 cri.go:89] found id: ""
	I1217 20:35:22.916064  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.916070  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:22.916075  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:22.916134  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:22.940231  420062 cri.go:89] found id: ""
	I1217 20:35:22.940244  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.940274  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:22.940280  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:22.940338  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:22.964651  420062 cri.go:89] found id: ""
	I1217 20:35:22.964665  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.964673  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:22.964678  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:22.964739  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:22.999102  420062 cri.go:89] found id: ""
	I1217 20:35:22.999118  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.999126  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:22.999133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:22.999201  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:23.031417  420062 cri.go:89] found id: ""
	I1217 20:35:23.031431  420062 logs.go:282] 0 containers: []
	W1217 20:35:23.031440  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:23.031447  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:23.031458  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:23.099279  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:23.099300  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:23.127896  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:23.127914  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:23.184706  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:23.184725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:23.199879  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:23.199895  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:23.267184  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
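	[editor's note] Every "describe nodes" attempt above fails the same way: kubectl dials localhost:8441 and gets connection refused, which is consistent with the empty kube-apiserver listing (nothing is bound to the apiserver port). A quick TCP readiness probe that would confirm that state is sketched below; the host and port come from the log, while the 2-second timeout is an assumed value for the sketch.

	// Minimal TCP readiness probe for the apiserver port that keeps
	// refusing connections in the log above.
	package main

	import (
		"fmt"
		"net"
		"time"
	)

	func main() {
		conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
		if err != nil {
			// With no kube-apiserver container running, this prints a
			// "connection refused" error, matching the kubectl stderr above.
			fmt.Println("apiserver not reachable:", err)
			return
		}
		conn.Close()
		fmt.Println("apiserver port is accepting connections")
	}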
	I1217 20:35:25.768885  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:25.778947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:25.779017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:25.802991  420062 cri.go:89] found id: ""
	I1217 20:35:25.803005  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.803025  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:25.803031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:25.803093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:25.830724  420062 cri.go:89] found id: ""
	I1217 20:35:25.830738  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.830745  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:25.830751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:25.830813  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:25.860059  420062 cri.go:89] found id: ""
	I1217 20:35:25.860073  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.860081  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:25.860085  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:25.860150  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:25.896087  420062 cri.go:89] found id: ""
	I1217 20:35:25.896101  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.896108  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:25.896114  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:25.896173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:25.921891  420062 cri.go:89] found id: ""
	I1217 20:35:25.921905  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.921912  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:25.921918  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:25.921975  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:25.946115  420062 cri.go:89] found id: ""
	I1217 20:35:25.946129  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.946137  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:25.946142  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:25.946199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:25.970696  420062 cri.go:89] found id: ""
	I1217 20:35:25.970711  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.970719  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:25.970727  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:25.970737  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:26.031476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:26.031497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:26.053026  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:26.053044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:26.121268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:26.121279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:26.121290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:26.183866  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:26.183888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
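	[editor's note] The pgrep timestamps (20:35:22 → :25 → :28 → :31 ...) show the whole probe-and-gather cycle retrying on roughly a three-second cadence. A generic sketch of such a wait loop follows; the interval and deadline are assumptions read off the timestamps, not values taken from minikube's source.

	// Generic retry loop illustrating the cadence visible in the
	// timestamps above: probe for the apiserver process, and on failure
	// sleep and try again until a deadline passes.
	package main

	import (
		"fmt"
		"os/exec"
		"time"
	)

	func main() {
		deadline := time.Now().Add(2 * time.Minute) // assumed deadline
		for time.Now().Before(deadline) {
			// Equivalent of: sudo pgrep -xnf kube-apiserver.*minikube.*
			// pgrep exits non-zero when no process matches.
			if err := exec.Command("sudo", "pgrep", "-xnf",
				"kube-apiserver.*minikube.*").Run(); err == nil {
				fmt.Println("kube-apiserver process found")
				return
			}
			// Process not found yet: gather diagnostics (elided) and retry.
			time.Sleep(3 * time.Second)
		}
		fmt.Println("timed out waiting for kube-apiserver")
	}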
	I1217 20:35:28.713125  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:28.723373  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:28.723436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:28.750204  420062 cri.go:89] found id: ""
	I1217 20:35:28.750218  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.750225  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:28.750231  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:28.750295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:28.774507  420062 cri.go:89] found id: ""
	I1217 20:35:28.774520  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.774528  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:28.774533  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:28.774593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:28.799202  420062 cri.go:89] found id: ""
	I1217 20:35:28.799217  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.799225  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:28.799230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:28.799295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:28.823894  420062 cri.go:89] found id: ""
	I1217 20:35:28.823908  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.823916  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:28.823921  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:28.823981  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:28.848696  420062 cri.go:89] found id: ""
	I1217 20:35:28.848710  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.848717  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:28.848722  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:28.848780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:28.874108  420062 cri.go:89] found id: ""
	I1217 20:35:28.874121  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.874129  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:28.874146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:28.874206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:28.899607  420062 cri.go:89] found id: ""
	I1217 20:35:28.899621  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.899628  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:28.899636  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:28.899646  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:28.955990  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:28.956010  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:28.970828  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:28.970844  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:29.048596  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:29.048606  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:29.048627  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:29.115475  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:29.115495  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:31.644907  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:31.654819  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:31.654879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:31.678281  420062 cri.go:89] found id: ""
	I1217 20:35:31.678295  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.678303  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:31.678308  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:31.678370  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:31.702902  420062 cri.go:89] found id: ""
	I1217 20:35:31.702916  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.702923  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:31.702929  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:31.702988  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:31.730614  420062 cri.go:89] found id: ""
	I1217 20:35:31.730629  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.730643  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:31.730648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:31.730715  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:31.757724  420062 cri.go:89] found id: ""
	I1217 20:35:31.757738  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.757745  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:31.757751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:31.757821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:31.781313  420062 cri.go:89] found id: ""
	I1217 20:35:31.781326  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.781333  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:31.781338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:31.781401  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:31.805048  420062 cri.go:89] found id: ""
	I1217 20:35:31.805061  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.805068  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:31.805074  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:31.805133  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:31.829157  420062 cri.go:89] found id: ""
	I1217 20:35:31.829172  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.829178  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:31.829186  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:31.829211  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:31.884232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:31.884262  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:31.899125  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:31.899143  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:31.960768  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:31.960779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:31.960789  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:32.026560  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:32.026580  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:34.561956  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:34.573345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:34.573414  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:34.601971  420062 cri.go:89] found id: ""
	I1217 20:35:34.601985  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.601993  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:34.601998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:34.602057  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:34.631487  420062 cri.go:89] found id: ""
	I1217 20:35:34.631500  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.631508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:34.631513  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:34.631572  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:34.656452  420062 cri.go:89] found id: ""
	I1217 20:35:34.656465  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.656473  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:34.656478  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:34.656540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:34.682582  420062 cri.go:89] found id: ""
	I1217 20:35:34.682596  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.682603  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:34.682609  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:34.682676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:34.713925  420062 cri.go:89] found id: ""
	I1217 20:35:34.713939  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.713947  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:34.713952  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:34.714017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:34.742385  420062 cri.go:89] found id: ""
	I1217 20:35:34.742400  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.742408  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:34.742414  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:34.742473  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:34.767035  420062 cri.go:89] found id: ""
	I1217 20:35:34.767049  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.767056  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:34.767064  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:34.767075  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:34.822796  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:34.822817  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:34.837590  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:34.837613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:34.900508  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:34.900518  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:34.900529  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:34.962881  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:34.962905  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:37.494984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:37.505451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:37.505514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:37.530852  420062 cri.go:89] found id: ""
	I1217 20:35:37.530866  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.530874  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:37.530885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:37.530948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:37.555283  420062 cri.go:89] found id: ""
	I1217 20:35:37.555298  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.555305  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:37.555319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:37.555384  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:37.580310  420062 cri.go:89] found id: ""
	I1217 20:35:37.580324  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.580342  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:37.580347  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:37.580407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:37.604561  420062 cri.go:89] found id: ""
	I1217 20:35:37.604575  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.604582  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:37.604587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:37.604649  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:37.633577  420062 cri.go:89] found id: ""
	I1217 20:35:37.633591  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.633598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:37.633603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:37.633668  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:37.659137  420062 cri.go:89] found id: ""
	I1217 20:35:37.659152  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.659159  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:37.659183  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:37.659280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:37.687689  420062 cri.go:89] found id: ""
	I1217 20:35:37.687704  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.687711  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:37.687719  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:37.687738  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:37.742459  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:37.742478  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:37.757175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:37.757191  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:37.822005  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:37.822015  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:37.822025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:37.885848  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:37.885870  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:40.416602  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:40.427031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:40.427099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:40.452190  420062 cri.go:89] found id: ""
	I1217 20:35:40.452204  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.452212  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:40.452218  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:40.452299  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:40.478942  420062 cri.go:89] found id: ""
	I1217 20:35:40.478956  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.478963  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:40.478969  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:40.479027  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:40.504873  420062 cri.go:89] found id: ""
	I1217 20:35:40.504886  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.504893  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:40.504898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:40.504958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:40.530153  420062 cri.go:89] found id: ""
	I1217 20:35:40.530167  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.530173  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:40.530179  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:40.530239  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:40.558703  420062 cri.go:89] found id: ""
	I1217 20:35:40.558717  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.558725  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:40.558731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:40.558799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:40.583753  420062 cri.go:89] found id: ""
	I1217 20:35:40.583768  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.583777  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:40.583793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:40.583856  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:40.608061  420062 cri.go:89] found id: ""
	I1217 20:35:40.608075  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.608083  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:40.608099  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:40.608111  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:40.665201  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:40.665222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:40.680290  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:40.680307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:40.752424  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:40.752435  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:40.752446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:40.819510  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:40.819535  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.356404  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:43.367228  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:43.367293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:43.391809  420062 cri.go:89] found id: ""
	I1217 20:35:43.391824  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.391831  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:43.391836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:43.391895  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:43.417869  420062 cri.go:89] found id: ""
	I1217 20:35:43.417883  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.417890  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:43.417895  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:43.417959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:43.443314  420062 cri.go:89] found id: ""
	I1217 20:35:43.443328  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.443335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:43.443340  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:43.443400  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:43.469332  420062 cri.go:89] found id: ""
	I1217 20:35:43.469346  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.469352  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:43.469358  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:43.469418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:43.494242  420062 cri.go:89] found id: ""
	I1217 20:35:43.494256  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.494264  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:43.494277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:43.494341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:43.520502  420062 cri.go:89] found id: ""
	I1217 20:35:43.520515  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.520523  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:43.520529  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:43.520592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:43.549390  420062 cri.go:89] found id: ""
	I1217 20:35:43.549404  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.549411  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:43.549419  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:43.549435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:43.565708  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:43.565725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:43.633544  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:43.633555  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:43.633567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:43.696433  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:43.696457  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.727227  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:43.727244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
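
The enumeration pattern repeated above (one `sudo crictl ps -a --quiet --name=<component>` per control-plane component, with empty output recorded as `found id: ""` and a "No container was found matching" warning) can be reproduced with a minimal Go sketch. This is an illustration of the command pattern only, not minikube's actual implementation:

// Illustrative sketch: shell out to crictl the way the log above shows and
// treat empty output as "no container found". Requires crictl and sudo at
// runtime; helper names here are hypothetical.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// listContainerIDs runs `sudo crictl ps -a --quiet --name=<name>` and returns
// the container IDs printed one per line. An empty slice corresponds to the
// repeated `found id: ""` lines in the log.
func listContainerIDs(name string) ([]string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
	if err != nil {
		return nil, err
	}
	return strings.Fields(strings.TrimSpace(string(out))), nil
}

func main() {
	components := []string{
		"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet",
	}
	for _, c := range components {
		ids, err := listContainerIDs(c)
		if err != nil {
			fmt.Printf("listing %q failed: %v\n", c, err)
			continue
		}
		if len(ids) == 0 {
			fmt.Printf("no container was found matching %q\n", c)
		}
	}
}
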
	I1217 20:35:46.288373  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:46.298318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:46.298381  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:46.322903  420062 cri.go:89] found id: ""
	I1217 20:35:46.322918  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.322925  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:46.322931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:46.322992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:46.347241  420062 cri.go:89] found id: ""
	I1217 20:35:46.347253  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.347260  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:46.347265  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:46.347324  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:46.372209  420062 cri.go:89] found id: ""
	I1217 20:35:46.372222  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.372229  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:46.372235  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:46.372313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:46.399343  420062 cri.go:89] found id: ""
	I1217 20:35:46.399357  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.399365  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:46.399370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:46.399430  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:46.425023  420062 cri.go:89] found id: ""
	I1217 20:35:46.425036  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.425051  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:46.425057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:46.425119  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:46.450066  420062 cri.go:89] found id: ""
	I1217 20:35:46.450080  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.450087  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:46.450092  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:46.450153  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:46.474598  420062 cri.go:89] found id: ""
	I1217 20:35:46.474612  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.474619  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:46.474644  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:46.474654  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:46.536781  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:46.536801  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:46.570140  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:46.570155  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.628870  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:46.628888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:46.643875  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:46.643891  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:46.709883  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
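
Every `describe nodes` attempt above fails the same way: kubectl cannot open a TCP connection to localhost:8441, where this run expects the apiserver to listen. A minimal probe that reproduces that exact condition (assuming nothing is listening on port 8441):

// Illustrative sketch: with no apiserver bound to 8441, DialTimeout returns
// the same "connect: connection refused" seen in the stderr blocks above.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver not reachable:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver port is accepting connections")
}
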
	I1217 20:35:49.210139  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:49.220394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:49.220461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:49.256343  420062 cri.go:89] found id: ""
	I1217 20:35:49.256358  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.256365  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:49.256370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:49.256431  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:49.290171  420062 cri.go:89] found id: ""
	I1217 20:35:49.290185  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.290193  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:49.290198  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:49.290261  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:49.320916  420062 cri.go:89] found id: ""
	I1217 20:35:49.320931  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.320939  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:49.320944  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:49.321003  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:49.345394  420062 cri.go:89] found id: ""
	I1217 20:35:49.345408  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.345415  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:49.345421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:49.345478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:49.370339  420062 cri.go:89] found id: ""
	I1217 20:35:49.370353  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.370360  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:49.370365  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:49.370424  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:49.394642  420062 cri.go:89] found id: ""
	I1217 20:35:49.394656  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.394663  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:49.394668  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:49.394734  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:49.422548  420062 cri.go:89] found id: ""
	I1217 20:35:49.422562  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.422569  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:49.422577  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:49.422594  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:49.479225  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:49.479246  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:49.494238  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:49.494255  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:49.560086  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:49.560096  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:49.560106  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:49.622094  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:49.622114  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.150210  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:52.160168  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:52.160231  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:52.184746  420062 cri.go:89] found id: ""
	I1217 20:35:52.184760  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.184767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:52.184779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:52.184835  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:52.209501  420062 cri.go:89] found id: ""
	I1217 20:35:52.209515  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.209522  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:52.209528  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:52.209586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:52.234558  420062 cri.go:89] found id: ""
	I1217 20:35:52.234571  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.234579  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:52.234584  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:52.234654  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:52.265703  420062 cri.go:89] found id: ""
	I1217 20:35:52.265716  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.265724  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:52.265729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:52.265794  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:52.297248  420062 cri.go:89] found id: ""
	I1217 20:35:52.297263  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.297270  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:52.297275  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:52.297334  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:52.325342  420062 cri.go:89] found id: ""
	I1217 20:35:52.325355  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.325362  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:52.325367  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:52.325433  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:52.349812  420062 cri.go:89] found id: ""
	I1217 20:35:52.349826  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.349843  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:52.349851  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:52.349862  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.380735  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:52.380751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:52.436131  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:52.436151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:52.451427  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:52.451445  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:52.518482  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:52.518492  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:52.518503  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
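
Each cycle above begins with `sudo pgrep -xnf kube-apiserver.*minikube.*`: -f matches the pattern against the full command line, -x requires the pattern to match that line exactly, and -n keeps only the newest matching process; pgrep exits with status 1 when nothing matches, which is why the log moves straight on to the crictl checks. A small wrapper sketch (hypothetical helper name, not minikube's code):

// Illustrative sketch of the pgrep-based process check.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// newestPID returns the PID of the newest process whose full command line
// matches pattern, or false if pgrep found nothing (exit status 1).
func newestPID(pattern string) (string, bool) {
	out, err := exec.Command("sudo", "pgrep", "-xnf", pattern).Output()
	if err != nil {
		return "", false // pgrep exits 1 when no process matches
	}
	return strings.TrimSpace(string(out)), true
}

func main() {
	if pid, ok := newestPID("kube-apiserver.*minikube.*"); ok {
		fmt.Println("apiserver pid:", pid)
	} else {
		fmt.Println("kube-apiserver process not found")
	}
}
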
	I1217 20:35:55.081073  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:55.091720  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:55.091797  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:55.117311  420062 cri.go:89] found id: ""
	I1217 20:35:55.117325  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.117333  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:55.117338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:55.117398  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:55.141668  420062 cri.go:89] found id: ""
	I1217 20:35:55.141683  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.141692  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:55.141697  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:55.141760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:55.166517  420062 cri.go:89] found id: ""
	I1217 20:35:55.166534  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.166541  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:55.166546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:55.166611  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:55.191282  420062 cri.go:89] found id: ""
	I1217 20:35:55.191296  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.191304  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:55.191309  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:55.191369  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:55.215605  420062 cri.go:89] found id: ""
	I1217 20:35:55.215619  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.215626  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:55.215631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:55.215690  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:55.247101  420062 cri.go:89] found id: ""
	I1217 20:35:55.247124  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.247132  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:55.247137  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:55.247205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:55.288704  420062 cri.go:89] found id: ""
	I1217 20:35:55.288718  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.288725  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:55.288732  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:55.288743  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:55.320382  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:55.320398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:55.379997  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:55.380016  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:55.394762  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:55.394780  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:55.459997  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:55.460007  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:55.460018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:58.024408  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:58.035410  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:58.035478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:58.062124  420062 cri.go:89] found id: ""
	I1217 20:35:58.062138  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.062145  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:58.062151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:58.062211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:58.088229  420062 cri.go:89] found id: ""
	I1217 20:35:58.088243  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.088270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:58.088276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:58.088335  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:58.113240  420062 cri.go:89] found id: ""
	I1217 20:35:58.113255  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.113261  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:58.113266  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:58.113325  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:58.141811  420062 cri.go:89] found id: ""
	I1217 20:35:58.141825  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.141832  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:58.141837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:58.141897  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:58.170463  420062 cri.go:89] found id: ""
	I1217 20:35:58.170477  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.170484  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:58.170490  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:58.170548  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:58.194647  420062 cri.go:89] found id: ""
	I1217 20:35:58.194670  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.194678  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:58.194684  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:58.194760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:58.219714  420062 cri.go:89] found id: ""
	I1217 20:35:58.219728  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.219735  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:58.219743  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:58.219754  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:58.263178  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:58.263194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:58.325412  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:58.325433  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:58.341419  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:58.341435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:58.403135  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:58.403147  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:58.403163  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:00.965498  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:00.975759  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:00.975820  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:01.000786  420062 cri.go:89] found id: ""
	I1217 20:36:01.000803  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.000811  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:01.000818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:01.000892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:01.025695  420062 cri.go:89] found id: ""
	I1217 20:36:01.025709  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.025716  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:01.025721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:01.025784  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:01.054712  420062 cri.go:89] found id: ""
	I1217 20:36:01.054727  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.054734  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:01.054739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:01.054799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:01.083318  420062 cri.go:89] found id: ""
	I1217 20:36:01.083332  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.083340  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:01.083345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:01.083406  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:01.107939  420062 cri.go:89] found id: ""
	I1217 20:36:01.107954  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.107962  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:01.107968  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:01.108030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:01.134926  420062 cri.go:89] found id: ""
	I1217 20:36:01.134940  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.134947  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:01.134954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:01.135018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:01.161095  420062 cri.go:89] found id: ""
	I1217 20:36:01.161111  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.161121  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:01.161130  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:01.161141  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:01.222094  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:01.222112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:01.239432  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:01.239449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:01.331243  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:01.331254  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:01.331265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:01.398128  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:01.398148  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
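
The cycle timestamps above advance in roughly three-second steps (20:35:43, :46, :49, :52, ...), consistent with a fixed-interval poll running under an overall deadline. A minimal sketch of that control flow, with apiserverUp as a hypothetical stand-in for the pgrep/crictl checks the log records:

// Illustrative sketch of a fixed-interval wait loop with a deadline.
package main

import (
	"context"
	"fmt"
	"time"
)

// apiserverUp is a stand-in for the real readiness check.
func apiserverUp() bool { return false }

func waitForAPIServer(ctx context.Context, interval time.Duration) error {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		if apiserverUp() {
			return nil
		}
		select {
		case <-ctx.Done():
			return fmt.Errorf("gave up waiting for apiserver: %w", ctx.Err())
		case <-ticker.C:
			// poll again on the next tick
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	fmt.Println(waitForAPIServer(ctx, 3*time.Second))
}
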
	I1217 20:36:03.929660  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:03.940045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:03.940111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:03.963644  420062 cri.go:89] found id: ""
	I1217 20:36:03.963658  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.963665  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:03.963670  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:03.963727  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:03.996893  420062 cri.go:89] found id: ""
	I1217 20:36:03.996907  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.996914  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:03.996919  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:03.996987  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:04.028499  420062 cri.go:89] found id: ""
	I1217 20:36:04.028514  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.028530  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:04.028535  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:04.028607  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:04.054700  420062 cri.go:89] found id: ""
	I1217 20:36:04.054715  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.054723  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:04.054728  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:04.054785  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:04.082040  420062 cri.go:89] found id: ""
	I1217 20:36:04.082054  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.082063  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:04.082068  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:04.082131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:04.107015  420062 cri.go:89] found id: ""
	I1217 20:36:04.107029  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.107037  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:04.107043  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:04.107109  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:04.134634  420062 cri.go:89] found id: ""
	I1217 20:36:04.134648  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.134655  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:04.134663  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:04.134673  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:04.191059  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:04.191079  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:04.206280  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:04.206298  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:04.297698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:04.297708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:04.297719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:04.364378  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:04.364398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:06.892149  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:06.902353  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:06.902418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:06.927834  420062 cri.go:89] found id: ""
	I1217 20:36:06.927847  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.927855  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:06.927860  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:06.927925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:06.952936  420062 cri.go:89] found id: ""
	I1217 20:36:06.952949  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.952956  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:06.952965  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:06.953024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:06.976184  420062 cri.go:89] found id: ""
	I1217 20:36:06.976198  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.976205  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:06.976210  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:06.976297  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:07.004079  420062 cri.go:89] found id: ""
	I1217 20:36:07.004093  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.004101  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:07.004106  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:07.004167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:07.029604  420062 cri.go:89] found id: ""
	I1217 20:36:07.029618  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.029625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:07.029630  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:07.029698  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:07.058618  420062 cri.go:89] found id: ""
	I1217 20:36:07.058637  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.058645  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:07.058650  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:07.058709  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:07.085932  420062 cri.go:89] found id: ""
	I1217 20:36:07.085946  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.085953  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:07.085961  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:07.085972  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:07.100543  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:07.100561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:07.162557  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:07.162567  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:07.162578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:07.226244  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:07.226265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:07.280558  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:07.280574  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
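Note: the block above is one pass of minikube's diagnostic sweep: it asks crictl for each expected control-plane container by name, finds none, then collects dmesg, "describe nodes", containerd, container-status, and kubelet logs. A minimal sketch of the polling half of that sweep, assuming crictl and the containerd socket inside the node (component names taken from the log):

    # Poll for each expected control-plane container; empty output means
    # the component never came up.
    for name in kube-apiserver etcd coredns kube-scheduler \
                kube-proxy kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$name")
      [ -z "$ids" ] && echo "No container was found matching \"$name\""
    done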
	I1217 20:36:09.844282  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:09.854593  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:09.854676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:09.883180  420062 cri.go:89] found id: ""
	I1217 20:36:09.883194  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.883202  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:09.883208  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:09.883268  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:09.907225  420062 cri.go:89] found id: ""
	I1217 20:36:09.907240  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.907248  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:09.907254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:09.907315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:09.936079  420062 cri.go:89] found id: ""
	I1217 20:36:09.936093  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.936100  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:09.936105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:09.936167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:09.961921  420062 cri.go:89] found id: ""
	I1217 20:36:09.961935  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.961943  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:09.961949  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:09.962028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:09.989285  420062 cri.go:89] found id: ""
	I1217 20:36:09.989299  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.989307  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:09.989312  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:09.989371  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:10.023888  420062 cri.go:89] found id: ""
	I1217 20:36:10.023905  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.023913  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:10.023920  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:10.023992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:10.056062  420062 cri.go:89] found id: ""
	I1217 20:36:10.056077  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.056084  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:10.056102  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:10.056112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:10.118144  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:10.118165  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:10.153504  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:10.153521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:10.209909  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:10.209931  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:10.224930  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:10.224946  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:10.310457  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
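Note: every "connection refused" above carries the same information: nothing is listening on the apiserver port 8441, so kubectl cannot even fetch the API group list. A quick manual check, assuming ss and curl are available inside the node (hypothetical commands, not taken from this log):

    # Confirm nothing is bound to the apiserver port, then probe it directly.
    sudo ss -ltnp | grep 8441 || echo "nothing listening on 8441"
    curl -sk https://localhost:8441/healthz || true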
	I1217 20:36:12.811296  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:12.821279  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:12.821339  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:12.845496  420062 cri.go:89] found id: ""
	I1217 20:36:12.845510  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.845519  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:12.845524  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:12.845582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:12.873951  420062 cri.go:89] found id: ""
	I1217 20:36:12.873966  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.873973  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:12.873978  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:12.874039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:12.898560  420062 cri.go:89] found id: ""
	I1217 20:36:12.898573  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.898580  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:12.898586  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:12.898661  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:12.931323  420062 cri.go:89] found id: ""
	I1217 20:36:12.931343  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.931350  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:12.931356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:12.931416  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:12.957667  420062 cri.go:89] found id: ""
	I1217 20:36:12.957680  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.957687  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:12.957692  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:12.957749  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:12.981848  420062 cri.go:89] found id: ""
	I1217 20:36:12.981863  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.981870  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:12.981876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:12.981934  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:13.007649  420062 cri.go:89] found id: ""
	I1217 20:36:13.007664  420062 logs.go:282] 0 containers: []
	W1217 20:36:13.007671  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:13.007679  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:13.007689  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:13.070827  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:13.070846  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:13.098938  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:13.098954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:13.155232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:13.155253  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:13.170218  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:13.170234  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:13.237601  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:15.739451  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:15.749635  420062 kubeadm.go:602] duration metric: took 4m4.768391835s to restartPrimaryControlPlane
	W1217 20:36:15.749706  420062 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 20:36:15.749781  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:36:16.165425  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:36:16.179463  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:36:16.187987  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:36:16.188041  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:36:16.195805  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:36:16.195815  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:36:16.195868  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:36:16.203578  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:36:16.203633  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:36:16.211222  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:36:16.218882  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:36:16.218939  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:36:16.226500  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.233980  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:36:16.234040  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.241486  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:36:16.250121  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:36:16.250177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
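Note: having given up on restarting the existing control plane after 4m4s, minikube runs "kubeadm reset" and then sweeps for stale kubeconfigs: for each of admin.conf, kubelet.conf, controller-manager.conf, and scheduler.conf it greps for the expected endpoint and removes the file when the grep fails (here the files are simply absent, so every grep exits with status 2). The sweep, condensed into a sketch with paths and endpoint taken from the log:

    # Remove any kubeconfig that does not point at the expected endpoint.
    for f in admin kubelet controller-manager scheduler; do
      conf="/etc/kubernetes/${f}.conf"
      if ! sudo grep -q "https://control-plane.minikube.internal:8441" "$conf" 2>/dev/null; then
        sudo rm -f "$conf"
      fi
    done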
	I1217 20:36:16.257963  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:36:16.296719  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:36:16.297028  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:36:16.367021  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:36:16.367085  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:36:16.367119  420062 kubeadm.go:319] OS: Linux
	I1217 20:36:16.367163  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:36:16.367211  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:36:16.367257  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:36:16.367304  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:36:16.367351  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:36:16.367397  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:36:16.367441  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:36:16.367493  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:36:16.367539  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:36:16.443855  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:36:16.443958  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:36:16.444047  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:36:16.456800  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:36:16.459720  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:36:16.459808  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:36:16.459875  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:36:16.459957  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:36:16.460026  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:36:16.460100  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:36:16.460156  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:36:16.460222  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:36:16.460299  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:36:16.460377  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:36:16.460454  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:36:16.460493  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:36:16.460552  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:36:16.591707  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:36:16.773515  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:36:16.895942  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:36:17.316963  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:36:17.418134  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:36:17.418872  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:36:17.421748  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:36:17.424898  420062 out.go:252]   - Booting up control plane ...
	I1217 20:36:17.424999  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:36:17.425075  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:36:17.425522  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:36:17.446706  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:36:17.446809  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:36:17.455830  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:36:17.455925  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:36:17.455963  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:36:17.596746  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:36:17.596869  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:40:17.595000  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000220112s
	I1217 20:40:17.595032  420062 kubeadm.go:319] 
	I1217 20:40:17.595086  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:40:17.595116  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:40:17.595215  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:40:17.595220  420062 kubeadm.go:319] 
	I1217 20:40:17.595317  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:40:17.595346  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:40:17.595375  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:40:17.595378  420062 kubeadm.go:319] 
	I1217 20:40:17.599582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:40:17.600077  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:40:17.600181  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:40:17.600461  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:40:17.600468  420062 kubeadm.go:319] 
	I1217 20:40:17.600540  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
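Note: "kubeadm init" gates on the kubelet's local health endpoint and aborts once the 4m0s kubelet-check budget is exhausted. The manual checks are the ones kubeadm itself suggests above; all three commands below are taken verbatim from the messages in this log:

    # Probe the kubelet health endpoint and inspect its service state.
    curl -sSL http://127.0.0.1:10248/healthz
    systemctl status kubelet
    journalctl -xeu kubelet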
	W1217 20:40:17.600694  420062 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1217 20:40:17.600780  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:40:18.014309  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:40:18.029681  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:40:18.029742  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:40:18.038728  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:40:18.038739  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:40:18.038796  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:40:18.047726  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:40:18.047785  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:40:18.056139  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:40:18.064964  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:40:18.065020  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:40:18.073071  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.081347  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:40:18.081407  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.089386  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:40:18.097546  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:40:18.097608  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:40:18.105445  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:40:18.146508  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:40:18.146883  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:40:18.223079  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:40:18.223139  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:40:18.223171  420062 kubeadm.go:319] OS: Linux
	I1217 20:40:18.223212  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:40:18.223257  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:40:18.223306  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:40:18.223354  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:40:18.223398  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:40:18.223442  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:40:18.223484  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:40:18.223529  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:40:18.223571  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:40:18.290116  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:40:18.290214  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:40:18.290297  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:40:18.296827  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:40:18.300313  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:40:18.300404  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:40:18.300483  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:40:18.300564  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:40:18.300623  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:40:18.300692  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:40:18.300745  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:40:18.300806  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:40:18.300867  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:40:18.300940  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:40:18.301011  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:40:18.301047  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:40:18.301101  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:40:18.651136  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:40:18.865861  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:40:19.156184  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:40:19.613234  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:40:19.777874  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:40:19.778689  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:40:19.781521  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:40:19.784636  420062 out.go:252]   - Booting up control plane ...
	I1217 20:40:19.784726  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:40:19.784798  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:40:19.786110  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:40:19.806173  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:40:19.806463  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:40:19.814039  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:40:19.814294  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:40:19.814465  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:40:19.960654  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:40:19.960777  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:44:19.954818  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001239508s
	I1217 20:44:19.954843  420062 kubeadm.go:319] 
	I1217 20:44:19.954896  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:44:19.954927  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:44:19.955102  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:44:19.955108  420062 kubeadm.go:319] 
	I1217 20:44:19.955205  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:44:19.955233  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:44:19.955262  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:44:19.955265  420062 kubeadm.go:319] 
	I1217 20:44:19.960153  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:44:19.960582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:44:19.960689  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:44:19.960924  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:44:19.960929  420062 kubeadm.go:319] 
	I1217 20:44:19.960996  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 20:44:19.961048  420062 kubeadm.go:403] duration metric: took 12m9.01968184s to StartCluster
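Note: the 12m9s StartCluster total is consistent with the waits recorded above: roughly 4m4s spent trying to restart the existing control plane, then two "kubeadm init" attempts (20:36:16 and 20:40:18) that each burned the full 4m0s kubelet-check budget, plus the reset and cleanup overhead between them.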
	I1217 20:44:19.961079  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:44:19.961139  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:44:19.999166  420062 cri.go:89] found id: ""
	I1217 20:44:19.999182  420062 logs.go:282] 0 containers: []
	W1217 20:44:19.999190  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:44:19.999195  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:44:19.999265  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:44:20.031203  420062 cri.go:89] found id: ""
	I1217 20:44:20.031218  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.031225  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:44:20.031230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:44:20.031293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:44:20.061179  420062 cri.go:89] found id: ""
	I1217 20:44:20.061193  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.061200  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:44:20.061219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:44:20.061280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:44:20.089093  420062 cri.go:89] found id: ""
	I1217 20:44:20.089107  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.089114  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:44:20.089120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:44:20.089183  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:44:20.119683  420062 cri.go:89] found id: ""
	I1217 20:44:20.119696  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.119704  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:44:20.119709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:44:20.119772  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:44:20.145500  420062 cri.go:89] found id: ""
	I1217 20:44:20.145514  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.145521  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:44:20.145526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:44:20.145586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:44:20.170345  420062 cri.go:89] found id: ""
	I1217 20:44:20.170359  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.170367  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:44:20.170377  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:44:20.170387  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:44:20.226476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:44:20.226496  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:44:20.241970  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:44:20.241987  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:44:20.311525  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:44:20.311535  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:44:20.311546  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:44:20.375759  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:44:20.375781  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
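Note on the container-status one-liner above: "which crictl || echo crictl" resolves crictl's path but degrades to the bare command name if it is not installed, and the trailing "|| sudo docker ps -a" falls back to Docker when the crictl invocation fails, so the same line covers both containerd- and docker-backed nodes.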
	W1217 20:44:20.404823  420062 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:44:20.404857  420062 out.go:285] * 
	W1217 20:44:20.404931  420062 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: [identical to the kubeadm init output, system verification warnings, and wait-control-plane error shown above]
	
	W1217 20:44:20.404948  420062 out.go:285] * 
	W1217 20:44:20.407052  420062 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:44:20.412138  420062 out.go:203] 
	W1217 20:44:20.415946  420062 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: [identical to the kubeadm init output, system verification warnings, and wait-control-plane error shown above]
	
	W1217 20:44:20.415994  420062 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:44:20.416018  420062 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:44:20.419093  420062 out.go:203] 
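
The stderr warnings above name the actual blocker: on this cgroup v1 host, kubelet v1.35 refuses to run unless the kubelet configuration explicitly sets FailCgroupV1 to false, and the deprecated-cgroup check has to be skipped as well (the kubeadm invocation above already lists SystemVerification in --ignore-preflight-errors). A minimal sketch of the opt-in, written as the kind of strategic-merge patch the [patches] phase above applies to the "kubeletconfiguration" target; the patch directory path below is a hypothetical example:

	# Sketch only: a kubeadm patch file opting kubelet back into cgroup v1,
	# per the SystemVerification warning above. Directory path is hypothetical.
	mkdir -p /var/tmp/minikube/patches
	cat <<'EOF' >/var/tmp/minikube/patches/kubeletconfiguration+strategic.yaml
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	failCgroupV1: false
	EOF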
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.485700024Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.528847726Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.532364395Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.541542678Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.562388570Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.886013953Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.888517483Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896652382Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896979157Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.215090972Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.217900174Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.221232159Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.234208610Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.526440562Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.528721959Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.537888598Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.538209006Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.567338451Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.569906392Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.572864134Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.580393179Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.405679689Z" level=info msg="No images store for sha256:05371fd6ad950eede907960b388fa9b50b39adf62f93dec0b13c9fc4ce7e1bc1"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.408072965Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.415708801Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.416196737Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:46:47.311714   23504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:47.312868   23504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:47.313758   23504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:47.314559   23504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:46:47.316025   23504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
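
Both the repeated memcache errors and the final message above report the same symptom: nothing is accepting connections on the apiserver port. A hedged first check, reusing the host and port from the log, is to probe the port directly before retrying kubectl:

	# Sketch: see whether anything answers on the apiserver port at all.
	# Host and port are taken from the refused connections above; -k skips
	# certificate verification since this is only a reachability probe.
	curl -k --max-time 5 https://localhost:8441/healthz \
	  || echo "apiserver is not listening on localhost:8441"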
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:46:47 up  3:29,  0 user,  load average: 0.52, 0.49, 0.54
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:46:43 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:44 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 512.
	Dec 17 20:46:44 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:44 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:44 functional-682596 kubelet[23386]: E1217 20:46:44.534266   23386 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:44 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:44 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:45 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 513.
	Dec 17 20:46:45 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:45 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:45 functional-682596 kubelet[23392]: E1217 20:46:45.295689   23392 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:45 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:45 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:45 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 514.
	Dec 17 20:46:45 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:45 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:46 functional-682596 kubelet[23398]: E1217 20:46:46.063958   23398 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:46 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:46 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:46:46 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 515.
	Dec 17 20:46:46 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:46 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:46:46 functional-682596 kubelet[23419]: E1217 20:46:46.800801   23419 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:46:46 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:46:46 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (371.096887ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.51s)

                                                
                                    
x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.63s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 8 more times]
I1217 20:44:46.603791  369461 retry.go:31] will retry after 3.207373493s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
E1217 20:44:49.014725  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 10 more times]
I1217 20:44:59.811941  369461 retry.go:31] will retry after 3.473909183s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 12 more times]
I1217 20:45:13.286837  369461 retry.go:31] will retry after 3.519026694s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 13 more times]
I1217 20:45:26.807019  369461 retry.go:31] will retry after 11.871014633s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 21 more times]
I1217 20:45:48.679388  369461 retry.go:31] will retry after 12.098413939s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[warning above repeated 20 more times]
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1217 20:46:10.779043  369461 retry.go:31] will retry after 24.788606947s: Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
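The retry.go line above shows the test helper backing off between attempts. As a generic sketch of that retry-with-backoff pattern (not minikube's actual retry.go implementation; the initial delay, cap, and jitter below are illustrative assumptions):

package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// retryExpo retries fn with exponentially growing, lightly jittered
// delays until it succeeds or the delay exceeds max.
func retryExpo(fn func() error, initial, max time.Duration) error {
	delay := initial
	for {
		err := fn()
		if err == nil {
			return nil
		}
		if delay > max {
			return fmt.Errorf("giving up: %w", err)
		}
		fmt.Printf("will retry after %s: %v\n", delay, err)
		time.Sleep(delay)
		// Double the delay and add a little jitter each attempt.
		delay = delay*2 + time.Duration(rand.Int63n(int64(delay/4)+1))
	}
}

func main() {
	attempts := 0
	err := retryExpo(func() error {
		attempts++
		if attempts < 3 {
			return errors.New("Temporary Error: context deadline exceeded")
		}
		return nil
	}, 500*time.Millisecond, 30*time.Second)
	fmt.Println("result:", err)
}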
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[last message repeated 17 more times]
E1217 20:46:28.507041  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[last message repeated 82 more times]
E1217 20:47:52.093523  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[last message repeated 45 more times]
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: client rate limiter Wait returned an error: context deadline exceeded
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (328.386145ms)

-- stdout --
	Stopped

-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
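
For context, the failing wait above is a label-selector poll against the kube-system namespace that gives up after 4m0s; every attempt dies at the TCP layer because nothing is listening on 192.168.49.2:8441. A minimal sketch of such a poll loop, assuming client-go and a reachable kubeconfig (waitForPod, the interval, and the Running check are illustrative, not minikube's actual helper):

// sketch: poll for a labeled pod until a deadline, mirroring the failing wait above
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func waitForPod(ctx context.Context, cs *kubernetes.Clientset, ns, selector string) error {
	// PollUntilContextTimeout retries until the condition returns true or the
	// 4-minute deadline expires ("context deadline exceeded" in the log above).
	return wait.PollUntilContextTimeout(ctx, 2*time.Second, 4*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			pods, err := cs.CoreV1().Pods(ns).List(ctx, metav1.ListOptions{LabelSelector: selector})
			if err != nil {
				// "connection refused" lands here; returning (false, nil) keeps
				// polling, which is why the log shows one WARNING per attempt.
				fmt.Println("WARNING:", err)
				return false, nil
			}
			for _, p := range pods.Items {
				if p.Status.Phase == "Running" {
					return true, nil
				}
			}
			return false, nil
		})
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	if err := waitForPod(context.Background(), cs, "kube-system", "integration-test=storage-provisioner"); err != nil {
		fmt.Println("failed waiting for storage-provisioner:", err)
	}
}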
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (293.499096ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                       ARGS                                                                        │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-682596 ssh findmnt -T /mount-9p | grep 9p                                                                                              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh            │ functional-682596 ssh -- ls -la /mount-9p                                                                                                         │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh            │ functional-682596 ssh sudo umount -f /mount-9p                                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount          │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount1 --alsologtostderr -v=1              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount          │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount2 --alsologtostderr -v=1              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ mount          │ -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount3 --alsologtostderr -v=1              │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh            │ functional-682596 ssh findmnt -T /mount1                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ ssh            │ functional-682596 ssh findmnt -T /mount1                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh            │ functional-682596 ssh findmnt -T /mount2                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ ssh            │ functional-682596 ssh findmnt -T /mount3                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │ 17 Dec 25 20:46 UTC │
	│ mount          │ -p functional-682596 --kill=true                                                                                                                  │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start          │ -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start          │ -p functional-682596 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ start          │ -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-682596 --alsologtostderr -v=1                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:46 UTC │                     │
	│ update-context │ functional-682596 update-context --alsologtostderr -v=2                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ update-context │ functional-682596 update-context --alsologtostderr -v=2                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ update-context │ functional-682596 update-context --alsologtostderr -v=2                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ image          │ functional-682596 image ls --format short --alsologtostderr                                                                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ image          │ functional-682596 image ls --format yaml --alsologtostderr                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ ssh            │ functional-682596 ssh pgrep buildkitd                                                                                                             │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │                     │
	│ image          │ functional-682596 image build -t localhost/my-image:functional-682596 testdata/build --alsologtostderr                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ image          │ functional-682596 image ls                                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ image          │ functional-682596 image ls --format json --alsologtostderr                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	│ image          │ functional-682596 image ls --format table --alsologtostderr                                                                                       │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:47 UTC │ 17 Dec 25 20:47 UTC │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:46:59
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:46:59.081910  439189 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:46:59.082355  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082395  439189 out.go:374] Setting ErrFile to fd 2...
	I1217 20:46:59.082418  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082870  439189 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:46:59.083356  439189 out.go:368] Setting JSON to false
	I1217 20:46:59.084244  439189 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":12564,"bootTime":1765991855,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:46:59.084377  439189 start.go:143] virtualization:  
	I1217 20:46:59.087733  439189 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:46:59.091444  439189 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:46:59.091535  439189 notify.go:221] Checking for updates...
	I1217 20:46:59.097262  439189 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:46:59.100117  439189 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:46:59.102903  439189 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:46:59.105734  439189 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:46:59.108516  439189 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:46:59.111896  439189 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:46:59.112611  439189 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:46:59.134996  439189 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:46:59.135121  439189 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:46:59.205094  439189 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:59.195937607 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:46:59.205199  439189 docker.go:319] overlay module found
	I1217 20:46:59.208339  439189 out.go:179] * Using the docker driver based on the existing profile
	I1217 20:46:59.211199  439189 start.go:309] selected driver: docker
	I1217 20:46:59.211235  439189 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:46:59.211332  439189 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:46:59.214852  439189 out.go:203] 
	W1217 20:46:59.217732  439189 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: The requested memory allocation of 250MiB is below the usable minimum of 1800MB
	I1217 20:46:59.220709  439189 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896652382Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:30 functional-682596 containerd[9792]: time="2025-12-17T20:44:30.896979157Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.215090972Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.217900174Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.221232159Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.234208610Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.526440562Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.528721959Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.537888598Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:32 functional-682596 containerd[9792]: time="2025-12-17T20:44:32.538209006Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.567338451Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.569906392Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.572864134Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 17 20:44:33 functional-682596 containerd[9792]: time="2025-12-17T20:44:33.580393179Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-682596\" returns successfully"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.405679689Z" level=info msg="No images store for sha256:05371fd6ad950eede907960b388fa9b50b39adf62f93dec0b13c9fc4ce7e1bc1"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.408072965Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.415708801Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:34 functional-682596 containerd[9792]: time="2025-12-17T20:44:34.416196737Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.606008729Z" level=info msg="connecting to shim iz02l13669yfm9mwr47xtve0i" address="unix:///run/containerd/s/cd0f0811091814f416ccf613b4a9ffe48511211b2807667e648732fec5cc49fa" namespace=k8s.io protocol=ttrpc version=3
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.681199047Z" level=info msg="shim disconnected" id=iz02l13669yfm9mwr47xtve0i namespace=k8s.io
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.681239417Z" level=info msg="cleaning up after shim disconnected" id=iz02l13669yfm9mwr47xtve0i namespace=k8s.io
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.681250666Z" level=info msg="cleaning up dead shim" id=iz02l13669yfm9mwr47xtve0i namespace=k8s.io
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.924748452Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-682596\""
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.933535377Z" level=info msg="ImageCreate event name:\"sha256:212d7ac8fb5b3349513f07673a14f9641bff360bca9ba48d3b318caf7f938aad\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:47:05 functional-682596 containerd[9792]: time="2025-12-17T20:47:05.933916176Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:48:39.685574   25345 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:48:39.686448   25345 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:48:39.688275   25345 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:48:39.688963   25345 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:48:39.690610   25345 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:48:39 up  3:31,  0 user,  load average: 0.30, 0.51, 0.54
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:48:36 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:48:37 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 662.
	Dec 17 20:48:37 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:37 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:37 functional-682596 kubelet[25214]: E1217 20:48:37.283707   25214 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:48:37 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:48:37 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:48:37 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 663.
	Dec 17 20:48:37 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:37 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:38 functional-682596 kubelet[25220]: E1217 20:48:38.047049   25220 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:48:38 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:48:38 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:48:38 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 664.
	Dec 17 20:48:38 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:38 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:38 functional-682596 kubelet[25239]: E1217 20:48:38.778219   25239 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:48:38 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:48:38 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:48:39 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 665.
	Dec 17 20:48:39 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:39 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:48:39 functional-682596 kubelet[25304]: E1217 20:48:39.551051   25304 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:48:39 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:48:39 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (333.407205ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.63s)
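
The root cause is visible in the kubelet section of the logs above: kubelet v1.35.0-rc.1 exits immediately on this cgroup v1 host ("kubelet is configured to not run on a host using cgroup v1"), so the apiserver never comes up and every dial to port 8441 is refused. A minimal sketch of how a host's cgroup version can be detected, assuming golang.org/x/sys/unix (the printed messages are illustrative, and the exact knob for re-enabling legacy v1 support, where still possible, is version-dependent):

// sketch: detect whether the host runs the unified cgroup v2 hierarchy,
// the precondition the crash-looping kubelet above is enforcing.
package main

import (
	"fmt"

	"golang.org/x/sys/unix"
)

func main() {
	var st unix.Statfs_t
	if err := unix.Statfs("/sys/fs/cgroup", &st); err != nil {
		panic(err)
	}
	// On a unified (v2) hierarchy /sys/fs/cgroup is a cgroup2 filesystem;
	// on v1 it is a tmpfs holding per-controller mounts.
	if st.Type == unix.CGROUP2_SUPER_MAGIC {
		fmt.Println("cgroup v2 (unified) - kubelet would start")
	} else {
		fmt.Println("cgroup v1 - matches the restart loop in the kubelet log above")
	}
}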

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (3.13s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-682596 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-682596 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (74.496567ms)

** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-682596 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
** stderr ** 
	E1217 20:44:28.662999  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.664501  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.665943  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.667399  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:44:28.668843  433362 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
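
For context, the NodeLabels check drives kubectl with a go-template that ranges over the first node's label keys; with the apiserver down the template never executes. A minimal, stdlib-only sketch of the same template evaluated against a stub node list (the stub data and label values here are illustrative):

// sketch: the label-iterating go-template from the failing kubectl call,
// run against a stub node list with Go's text/template.
package main

import (
	"os"
	"text/template"
)

func main() {
	// Stub of the JSON shape `kubectl get nodes -o go-template` walks.
	nodes := map[string]any{
		"items": []map[string]any{
			{"metadata": map[string]any{"labels": map[string]string{
				"minikube.k8s.io/name":    "functional-682596",
				"minikube.k8s.io/primary": "true",
			}}},
		},
	}
	// Same template string the test passes via --template.
	tmpl := template.Must(template.New("labels").Parse(
		`{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}`))
	// Prints each label key followed by a space, which the test then
	// scans for the expected minikube.k8s.io/* labels.
	if err := tmpl.Execute(os.Stdout, nodes); err != nil {
		panic(err)
	}
}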
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-682596
helpers_test.go:244: (dbg) docker inspect functional-682596:

-- stdout --
	[
	    {
	        "Id": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	        "Created": "2025-12-17T20:17:26.774929696Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 408854,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T20:17:26.844564666Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hostname",
	        "HostsPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/hosts",
	        "LogPath": "/var/lib/docker/containers/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77/efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77-json.log",
	        "Name": "/functional-682596",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-682596:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-682596",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "efc9468a7e551914f92ba48a75f43698d6d0bf3671e8866cdeeebc5a6393be77",
	                "LowerDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/merged",
	                "UpperDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/diff",
	                "WorkDir": "/var/lib/docker/overlay2/87cdd73f63f42def67677e3949be0b1e0c0455a2f4f85554084b51511cf6b268/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-682596",
	                "Source": "/var/lib/docker/volumes/functional-682596/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-682596",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-682596",
	                "name.minikube.sigs.k8s.io": "functional-682596",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "8e0f8d4915f888f90df7adb000bd0e749885d304e33053e85751193487b627b9",
	            "SandboxKey": "/var/run/docker/netns/8e0f8d4915f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33163"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33164"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33167"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33165"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33166"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-682596": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "de:95:c1:d9:d4:32",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "9e66e4dbc8284f728f81715f37c51d8272e96fcac9fb378874c982b3077b6cc2",
	                    "EndpointID": "0db3c56cfb2be75c981ed53adcc07de7cd33db60d51c01b0e875c8d41cf02897",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-682596",
	                        "efc9468a7e55"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
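
Note: the field in the inspect dump that matters for the failure above is the 8441/tcp host mapping. It can be pulled out with the same Go-template idiom minikube itself uses for the SSH port later in these logs (a quick manual check, not part of the test run):

    docker inspect -f '{{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}' functional-682596
    # prints 33166 for the container state shown above
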
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-682596 -n functional-682596: exit status 2 (407.481314ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 logs -n 25: (1.387153808s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                             ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-682596 ssh sudo crictl images                                                                                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │ 17 Dec 25 20:31 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:31 UTC │                     │
	│ cache   │ functional-682596 cache reload                                                                                                                               │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ ssh     │ functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │ 17 Dec 25 20:32 UTC │
	│ kubectl │ functional-682596 kubectl -- --context functional-682596 get pods                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	│ start   │ -p functional-682596 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                     │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:32 UTC │                     │
	│ cp      │ functional-682596 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                           │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ config  │ functional-682596 config unset cpus                                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ config  │ functional-682596 config get cpus                                                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ config  │ functional-682596 config set cpus 2                                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ config  │ functional-682596 config get cpus                                                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh -n functional-682596 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ config  │ functional-682596 config unset cpus                                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ config  │ functional-682596 config get cpus                                                                                                                            │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ license │                                                                                                                                                              │ minikube          │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ cp      │ functional-682596 cp functional-682596:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm1480265787/001/cp-test.txt │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo systemctl is-active docker                                                                                                        │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ ssh     │ functional-682596 ssh -n functional-682596 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh sudo systemctl is-active crio                                                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	│ cp      │ functional-682596 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                    │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ ssh     │ functional-682596 ssh -n functional-682596 sudo cat /tmp/does/not/exist/cp-test.txt                                                                          │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │ 17 Dec 25 20:44 UTC │
	│ image   │ functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr                                                                │ functional-682596 │ jenkins │ v1.37.0 │ 17 Dec 25 20:44 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:32:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:32:06.395598  420062 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:32:06.395704  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395708  420062 out.go:374] Setting ErrFile to fd 2...
	I1217 20:32:06.395712  420062 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:32:06.395972  420062 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:32:06.396388  420062 out.go:368] Setting JSON to false
	I1217 20:32:06.397206  420062 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":11672,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:32:06.397266  420062 start.go:143] virtualization:  
	I1217 20:32:06.400889  420062 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:32:06.403953  420062 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:32:06.404019  420062 notify.go:221] Checking for updates...
	I1217 20:32:06.410244  420062 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:32:06.413231  420062 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:32:06.416152  420062 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:32:06.419145  420062 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:32:06.422186  420062 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:32:06.425355  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:06.425444  420062 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:32:06.459431  420062 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:32:06.459555  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.531840  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.520070933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.531937  420062 docker.go:319] overlay module found
	I1217 20:32:06.535075  420062 out.go:179] * Using the docker driver based on existing profile
	I1217 20:32:06.538013  420062 start.go:309] selected driver: docker
	I1217 20:32:06.538025  420062 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.538123  420062 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:32:06.538239  420062 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:32:06.599898  420062 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-17 20:32:06.590438982 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:32:06.600362  420062 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 20:32:06.600387  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:06.600439  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:06.600480  420062 start.go:353] cluster config:
	{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:06.605529  420062 out.go:179] * Starting "functional-682596" primary control-plane node in "functional-682596" cluster
	I1217 20:32:06.608314  420062 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:32:06.611190  420062 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:32:06.614228  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:06.614282  420062 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:32:06.614283  420062 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:32:06.614291  420062 cache.go:65] Caching tarball of preloaded images
	I1217 20:32:06.614394  420062 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 20:32:06.614404  420062 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 20:32:06.614527  420062 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/config.json ...
	I1217 20:32:06.634867  420062 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 20:32:06.634879  420062 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 20:32:06.634892  420062 cache.go:243] Successfully downloaded all kic artifacts
	I1217 20:32:06.634927  420062 start.go:360] acquireMachinesLock for functional-682596: {Name:mk49b95a4c72eb2d15a1ae0f35918a9843d0b3df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 20:32:06.634983  420062 start.go:364] duration metric: took 39.828µs to acquireMachinesLock for "functional-682596"
	I1217 20:32:06.635002  420062 start.go:96] Skipping create...Using existing machine configuration
	I1217 20:32:06.635007  420062 fix.go:54] fixHost starting: 
	I1217 20:32:06.635262  420062 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
	I1217 20:32:06.652755  420062 fix.go:112] recreateIfNeeded on functional-682596: state=Running err=<nil>
	W1217 20:32:06.652776  420062 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 20:32:06.656001  420062 out.go:252] * Updating the running docker "functional-682596" container ...
	I1217 20:32:06.656027  420062 machine.go:94] provisionDockerMachine start ...
	I1217 20:32:06.656117  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.673371  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.673711  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.673717  420062 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 20:32:06.807817  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.807832  420062 ubuntu.go:182] provisioning hostname "functional-682596"
	I1217 20:32:06.807905  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.825970  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.826266  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.826274  420062 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-682596 && echo "functional-682596" | sudo tee /etc/hostname
	I1217 20:32:06.965026  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-682596
	
	I1217 20:32:06.965108  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:06.983394  420062 main.go:143] libmachine: Using SSH client type: native
	I1217 20:32:06.983695  420062 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33163 <nil> <nil>}
	I1217 20:32:06.983710  420062 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-682596' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-682596/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-682596' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 20:32:07.116833  420062 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 20:32:07.116850  420062 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 20:32:07.116869  420062 ubuntu.go:190] setting up certificates
	I1217 20:32:07.116877  420062 provision.go:84] configureAuth start
	I1217 20:32:07.116947  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.134531  420062 provision.go:143] copyHostCerts
	I1217 20:32:07.134601  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 20:32:07.134608  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 20:32:07.134696  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 20:32:07.134816  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 20:32:07.134820  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 20:32:07.134849  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 20:32:07.134907  420062 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 20:32:07.134911  420062 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 20:32:07.134937  420062 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 20:32:07.134994  420062 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.functional-682596 san=[127.0.0.1 192.168.49.2 functional-682596 localhost minikube]
	I1217 20:32:07.402222  420062 provision.go:177] copyRemoteCerts
	I1217 20:32:07.402275  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 20:32:07.402313  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.421789  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.516787  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 20:32:07.535734  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1217 20:32:07.553569  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 20:32:07.572193  420062 provision.go:87] duration metric: took 455.301945ms to configureAuth
	I1217 20:32:07.572211  420062 ubuntu.go:206] setting minikube options for container-runtime
	I1217 20:32:07.572513  420062 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:32:07.572520  420062 machine.go:97] duration metric: took 916.488302ms to provisionDockerMachine
	I1217 20:32:07.572527  420062 start.go:293] postStartSetup for "functional-682596" (driver="docker")
	I1217 20:32:07.572544  420062 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 20:32:07.572595  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 20:32:07.572635  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.593078  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.688373  420062 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 20:32:07.691957  420062 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 20:32:07.691978  420062 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 20:32:07.691989  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 20:32:07.692044  420062 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 20:32:07.692122  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 20:32:07.692197  420062 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts -> hosts in /etc/test/nested/copy/369461
	I1217 20:32:07.692238  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/369461
	I1217 20:32:07.699873  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.718147  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts --> /etc/test/nested/copy/369461/hosts (40 bytes)
	I1217 20:32:07.736089  420062 start.go:296] duration metric: took 163.546649ms for postStartSetup
	I1217 20:32:07.736163  420062 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:32:07.736210  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.753837  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.845496  420062 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 20:32:07.850448  420062 fix.go:56] duration metric: took 1.215434362s for fixHost
	I1217 20:32:07.850463  420062 start.go:83] releasing machines lock for "functional-682596", held for 1.215473649s
	I1217 20:32:07.850551  420062 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-682596
	I1217 20:32:07.871450  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:07.871498  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:07.871505  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:07.871531  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:07.871602  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:07.871627  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:07.871680  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:07.871748  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:07.871798  420062 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
	I1217 20:32:07.889554  420062 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
	I1217 20:32:07.998672  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:08.024673  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:08.048014  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:08.055454  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.065155  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:08.073391  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077720  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.077778  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:08.119356  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:08.127518  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.135465  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:08.143207  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147322  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.147376  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:08.188376  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 20:32:08.196028  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.203401  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:08.211111  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214821  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.214891  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:08.256072  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
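
Note: the openssl/ln/test sequence above is the standard OpenSSL hashed-directory convention: each CA file gets a symlink named <subject-hash>.0 under /etc/ssl/certs. Each "test -L" line checks the hash printed by the "openssl x509 -hash" run just before it; for example (a manual re-check of what the log already verified):

    openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem   # prints b5213941, per the log above
    sudo test -L /etc/ssl/certs/b5213941.0 && echo "hash symlink present"
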
	I1217 20:32:08.263331  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 20:32:08.266724  420062 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
	I1217 20:32:08.270040  420062 ssh_runner.go:195] Run: cat /version.json
	I1217 20:32:08.270111  420062 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 20:32:08.361093  420062 ssh_runner.go:195] Run: systemctl --version
	I1217 20:32:08.367706  420062 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 20:32:08.372063  420062 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 20:32:08.372127  420062 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 20:32:08.380119  420062 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 20:32:08.380133  420062 start.go:496] detecting cgroup driver to use...
	I1217 20:32:08.380163  420062 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 20:32:08.380223  420062 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 20:32:08.395765  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 20:32:08.409064  420062 docker.go:218] disabling cri-docker service (if available) ...
	I1217 20:32:08.409142  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 20:32:08.425141  420062 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 20:32:08.438808  420062 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 20:32:08.558555  420062 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 20:32:08.681937  420062 docker.go:234] disabling docker service ...
	I1217 20:32:08.681997  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 20:32:08.701323  420062 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 20:32:08.715923  420062 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 20:32:08.835610  420062 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 20:32:08.958372  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 20:32:08.972822  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 20:32:08.987570  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 20:32:08.997169  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 20:32:09.008742  420062 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 20:32:09.008821  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 20:32:09.018997  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.028318  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 20:32:09.037280  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 20:32:09.046375  420062 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 20:32:09.054925  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 20:32:09.064191  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 20:32:09.073303  420062 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 20:32:09.082553  420062 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 20:32:09.090003  420062 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 20:32:09.097524  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.216967  420062 ssh_runner.go:195] Run: sudo systemctl restart containerd
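
Note: the sed edits above point /etc/containerd/config.toml at minikube's expectations (cgroupfs cgroups via SystemdCgroup = false, pause:3.10.1 sandbox image, runc.v2 runtime, /etc/cni/net.d conf dir, unprivileged ports enabled) before the restart. A spot-check that they landed, assuming docker exec access to the node container (not part of the test run):

    # expected values per the sed expressions above
    docker exec functional-682596 sh -c 'grep -nE "SystemdCgroup|sandbox_image|enable_unprivileged_ports" /etc/containerd/config.toml'
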
	I1217 20:32:09.360558  420062 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 20:32:09.360617  420062 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 20:32:09.364443  420062 start.go:564] Will wait 60s for crictl version
	I1217 20:32:09.364497  420062 ssh_runner.go:195] Run: which crictl
	I1217 20:32:09.368129  420062 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 20:32:09.397262  420062 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 20:32:09.397334  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.420778  420062 ssh_runner.go:195] Run: containerd --version
	I1217 20:32:09.446347  420062 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 20:32:09.449338  420062 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 20:32:09.466521  420062 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1217 20:32:09.473221  420062 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1217 20:32:09.476024  420062 kubeadm.go:884] updating cluster {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 20:32:09.476173  420062 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:32:09.476285  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.523837  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.523848  420062 containerd.go:534] Images already preloaded, skipping extraction
	I1217 20:32:09.523905  420062 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 20:32:09.551003  420062 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 20:32:09.551014  420062 cache_images.go:86] Images are preloaded, skipping loading
	I1217 20:32:09.551021  420062 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1217 20:32:09.551143  420062 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-682596 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1217 20:32:09.551208  420062 ssh_runner.go:195] Run: sudo crictl info
	I1217 20:32:09.578643  420062 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1217 20:32:09.578665  420062 cni.go:84] Creating CNI manager for ""
	I1217 20:32:09.578673  420062 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:32:09.578683  420062 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 20:32:09.578707  420062 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-682596 NodeName:functional-682596 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 20:32:09.578827  420062 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-682596"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
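The generated kubeadm.yaml above stacks four YAML documents (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) separated by "---". A quick stdlib-only sanity check that splits the file and reports each document's apiVersion and kind (a real validator would decode into the kubeadm API types):

package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	data, err := os.ReadFile("/var/tmp/minikube/kubeadm.yaml")
	if err != nil {
		panic(err)
	}
	// Split the multi-document file on its "---" separators.
	for i, doc := range strings.Split(string(data), "\n---\n") {
		for _, line := range strings.Split(doc, "\n") {
			line = strings.TrimSpace(line)
			if strings.HasPrefix(line, "apiVersion:") || strings.HasPrefix(line, "kind:") {
				fmt.Printf("doc %d: %s\n", i, line)
			}
		}
	}
}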
	I1217 20:32:09.578904  420062 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 20:32:09.586879  420062 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 20:32:09.586939  420062 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 20:32:09.594505  420062 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1217 20:32:09.607281  420062 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 20:32:09.619808  420062 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1217 20:32:09.632685  420062 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1217 20:32:09.636364  420062 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 20:32:09.746796  420062 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 20:32:10.238623  420062 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596 for IP: 192.168.49.2
	I1217 20:32:10.238634  420062 certs.go:195] generating shared ca certs ...
	I1217 20:32:10.238650  420062 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 20:32:10.238819  420062 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 20:32:10.238897  420062 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 20:32:10.238904  420062 certs.go:257] generating profile certs ...
	I1217 20:32:10.238995  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.key
	I1217 20:32:10.239044  420062 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key.0c30bf8d
	I1217 20:32:10.239082  420062 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key
	I1217 20:32:10.239190  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 20:32:10.239221  420062 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 20:32:10.239227  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 20:32:10.239261  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 20:32:10.239282  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 20:32:10.239304  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 20:32:10.239345  420062 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 20:32:10.239934  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 20:32:10.261870  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 20:32:10.286466  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 20:32:10.307033  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 20:32:10.325172  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1217 20:32:10.343499  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 20:32:10.361814  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 20:32:10.379595  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 20:32:10.397590  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 20:32:10.415855  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 20:32:10.435021  420062 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 20:32:10.453267  420062 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 20:32:10.466474  420062 ssh_runner.go:195] Run: openssl version
	I1217 20:32:10.472863  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.480366  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 20:32:10.487904  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491724  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.491791  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 20:32:10.533110  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 20:32:10.540758  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.548093  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 20:32:10.555384  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.558983  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.559039  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 20:32:10.602447  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 20:32:10.609962  420062 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.617251  420062 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 20:32:10.625102  420062 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629186  420062 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.629244  420062 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 20:32:10.670572  420062 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
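The three cycles above install each CA into /usr/share/ca-certificates and then wire up the subject-hash symlink OpenSSL uses for CA lookup: "openssl x509 -hash -noout" prints the hash, and /etc/ssl/certs/<hash>.0 must point at the PEM. A sketch of the same dance (requires root, since it writes under /etc/ssl/certs):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

// installHashLink recreates /etc/ssl/certs/<subject-hash>.0 -> pem,
// mirroring the ln -fs / openssl x509 -hash / test -L sequence above.
func installHashLink(pem string) error {
	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pem).Output()
	if err != nil {
		return err
	}
	link := filepath.Join("/etc/ssl/certs", strings.TrimSpace(string(out))+".0")
	_ = os.Remove(link) // emulate ln -fs: replace any existing link
	return os.Symlink(pem, link)
}

func main() {
	if err := installHashLink("/usr/share/ca-certificates/minikubeCA.pem"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}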
	I1217 20:32:10.678295  420062 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 20:32:10.682347  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 20:32:10.723286  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 20:32:10.764614  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 20:32:10.806369  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 20:32:10.856829  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 20:32:10.900136  420062 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
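Each "-checkend 86400" run above asks openssl whether the certificate expires within the next 86400 seconds (24 hours); a nonzero exit would trigger regeneration. The same test with Go's crypto/x509, using one cert path taken from the log:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the PEM certificate at path expires within d,
// the same question `openssl x509 -checkend` answers via its exit status.
func expiresWithin(path string, d time.Duration) (bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return false, fmt.Errorf("%s: no PEM block found", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("expires within 24h:", soon)
}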
	I1217 20:32:10.941380  420062 kubeadm.go:401] StartCluster: {Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:32:10.941458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 20:32:10.941532  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:10.973304  420062 cri.go:89] found id: ""
	I1217 20:32:10.973369  420062 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 20:32:10.981213  420062 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 20:32:10.981233  420062 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 20:32:10.981284  420062 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 20:32:10.989643  420062 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:10.990148  420062 kubeconfig.go:125] found "functional-682596" server: "https://192.168.49.2:8441"
	I1217 20:32:10.991404  420062 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 20:32:11.001770  420062 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 20:17:35.203485302 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 20:32:09.624537089 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
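The drift check above leans on diff's exit status: "diff -u old new" exits 0 when the files match and 1 when they differ, and a difference marks the deployed kubeadm.yaml as stale. Here the only drift is the admission-plugin list, because a user-supplied enable-admission-plugins value replaces the entire default list rather than extending it (as the overwrite message earlier in the log states). A sketch of the same exit-status-driven detection:

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	out, err := exec.Command("diff", "-u",
		"/var/tmp/minikube/kubeadm.yaml", "/var/tmp/minikube/kubeadm.yaml.new").Output()
	switch e := err.(type) {
	case nil:
		fmt.Println("no drift")
	case *exec.ExitError:
		// diff exits 1 when the files differ, >1 on real errors.
		if e.ExitCode() == 1 {
			fmt.Printf("drift detected, will reconfigure:\n%s", out)
		} else {
			panic(err)
		}
	default:
		panic(err)
	}
}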
	I1217 20:32:11.001793  420062 kubeadm.go:1161] stopping kube-system containers ...
	I1217 20:32:11.001810  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 20:32:11.001907  420062 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 20:32:11.031815  420062 cri.go:89] found id: ""
	I1217 20:32:11.031894  420062 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 20:32:11.052689  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:32:11.061497  420062 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec 17 20:21 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 17 20:21 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 17 20:21 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 17 20:21 /etc/kubernetes/scheduler.conf
	
	I1217 20:32:11.061561  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:32:11.069861  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:32:11.077903  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.077964  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:32:11.085969  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.094098  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.094177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:32:11.102002  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:32:11.110213  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 20:32:11.110288  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
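Each grep above checks a kubeconfig for the expected server URL; grep's exit status 1 means the endpoint is absent, so the file is removed and later regenerated by "kubeadm init phase kubeconfig all". The equivalent check without shelling out:

package main

import (
	"fmt"
	"os"
	"strings"
)

// ensureEndpoint removes a kubeconfig that does not reference the expected
// control-plane endpoint, so a later kubeadm phase can regenerate it.
func ensureEndpoint(path, endpoint string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	if strings.Contains(string(data), endpoint) {
		return nil // endpoint already present, keep the file
	}
	fmt.Printf("%s missing %s, removing\n", path, endpoint)
	return os.Remove(path)
}

func main() {
	endpoint := "https://control-plane.minikube.internal:8441"
	for _, f := range []string{
		"/etc/kubernetes/kubelet.conf",
		"/etc/kubernetes/controller-manager.conf",
		"/etc/kubernetes/scheduler.conf",
	} {
		if err := ensureEndpoint(f, endpoint); err != nil {
			fmt.Fprintln(os.Stderr, err)
		}
	}
}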
	I1217 20:32:11.119148  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:32:11.127567  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:11.176595  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.173518  420062 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.996897383s)
	I1217 20:32:13.173578  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.380045  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 20:32:13.450955  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
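On restart, minikube re-runs individual kubeadm phases rather than a full "kubeadm init": certs, then kubeconfig, then kubelet-start, then control-plane, then etcd, all against the same /var/tmp/minikube/kubeadm.yaml. A sketch driving that sequence (paths from the log; error handling reduced to bail-on-first-failure):

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	kubeadm := "/var/lib/minikube/binaries/v1.35.0-rc.1/kubeadm"
	cfg := "/var/tmp/minikube/kubeadm.yaml"
	phases := [][]string{
		{"init", "phase", "certs", "all"},
		{"init", "phase", "kubeconfig", "all"},
		{"init", "phase", "kubelet-start"},
		{"init", "phase", "control-plane", "all"},
		{"init", "phase", "etcd", "local"},
	}
	for _, p := range phases {
		args := append([]string{kubeadm}, append(p, "--config", cfg)...)
		fmt.Println("running:", args)
		if out, err := exec.Command("sudo", args...).CombinedOutput(); err != nil {
			fmt.Printf("%s\nphase failed: %v\n", out, err)
			return
		}
	}
}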
	I1217 20:32:13.494559  420062 api_server.go:52] waiting for apiserver process to appear ...
	I1217 20:32:13.494629  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... 118 further identical `sudo pgrep -xnf kube-apiserver.*minikube.*` probes, issued every ~500ms between 20:32:13.99 and 20:33:12.49, elided; none found an apiserver process ...]
	I1217 20:33:12.995364  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
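The ~60 seconds of probes condensed above amount to a fixed-interval wait: run pgrep for the apiserver process every 500ms until it exits 0 or the deadline passes. A minimal sketch of that loop:

package main

import (
	"fmt"
	"os/exec"
	"time"
)

func main() {
	// Poll for a kube-apiserver process every 500ms, as the probes above do,
	// and give up after 60s (the window observed in this log).
	deadline := time.Now().Add(60 * time.Second)
	for time.Now().Before(deadline) {
		if exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil {
			fmt.Println("apiserver process found")
			return
		}
		time.Sleep(500 * time.Millisecond)
	}
	fmt.Println("timed out waiting for apiserver process")
}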
	I1217 20:33:13.495637  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:13.495716  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:13.520703  420062 cri.go:89] found id: ""
	I1217 20:33:13.520717  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.520724  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:13.520729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:13.520793  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:13.549658  420062 cri.go:89] found id: ""
	I1217 20:33:13.549672  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.549680  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:13.549685  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:13.549748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:13.574860  420062 cri.go:89] found id: ""
	I1217 20:33:13.574873  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.574880  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:13.574885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:13.574945  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:13.602159  420062 cri.go:89] found id: ""
	I1217 20:33:13.602173  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.602180  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:13.602185  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:13.602244  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:13.625735  420062 cri.go:89] found id: ""
	I1217 20:33:13.625748  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.625755  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:13.625760  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:13.625816  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:13.650446  420062 cri.go:89] found id: ""
	I1217 20:33:13.650460  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.650468  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:13.650473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:13.650533  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:13.677915  420062 cri.go:89] found id: ""
	I1217 20:33:13.677929  420062 logs.go:282] 0 containers: []
	W1217 20:33:13.677936  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:13.677944  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:13.677954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:13.692434  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:13.692449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:13.767790  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:13.758832   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.759470   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.761607   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.762393   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:13.763960   10858 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:13.767810  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:13.767820  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:13.839665  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:13.839685  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:13.872573  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:13.872589  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
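The container-status gathering above uses a shell fallback, sudo `which crictl || echo crictl` ps -a || sudo docker ps -a, so it still produces output on hosts where crictl is missing. The same try-then-fall-back pattern in Go:

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Prefer crictl, fall back to docker, mirroring the shell one-liner above.
	for _, cmd := range [][]string{
		{"sudo", "crictl", "ps", "-a"},
		{"sudo", "docker", "ps", "-a"},
	} {
		if out, err := exec.Command(cmd[0], cmd[1:]...).CombinedOutput(); err == nil {
			fmt.Print(string(out))
			return
		}
		fmt.Printf("%v failed\n", cmd)
	}
}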
	I1217 20:33:16.429115  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:16.438989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:16.439051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:16.466518  420062 cri.go:89] found id: ""
	I1217 20:33:16.466532  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.466539  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:16.466545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:16.466602  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:16.492200  420062 cri.go:89] found id: ""
	I1217 20:33:16.492213  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.492221  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:16.492226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:16.492302  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:16.517055  420062 cri.go:89] found id: ""
	I1217 20:33:16.517070  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.517083  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:16.517088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:16.517148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:16.552138  420062 cri.go:89] found id: ""
	I1217 20:33:16.552152  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.552159  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:16.552165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:16.552235  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:16.577184  420062 cri.go:89] found id: ""
	I1217 20:33:16.577198  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.577214  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:16.577220  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:16.577279  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:16.602039  420062 cri.go:89] found id: ""
	I1217 20:33:16.602053  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.602060  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:16.602066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:16.602124  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:16.626732  420062 cri.go:89] found id: ""
	I1217 20:33:16.626745  420062 logs.go:282] 0 containers: []
	W1217 20:33:16.626752  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:16.626760  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:16.626770  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:16.689454  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:16.689473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:16.722345  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:16.722363  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:16.784686  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:16.784705  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:16.801895  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:16.801911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:16.865697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:16.856899   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.857554   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859279   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.859924   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:16.861707   10985 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:19.365915  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:19.375998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:19.376066  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:19.399955  420062 cri.go:89] found id: ""
	I1217 20:33:19.399968  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.399976  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:19.399981  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:19.400039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:19.424668  420062 cri.go:89] found id: ""
	I1217 20:33:19.424682  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.424689  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:19.424695  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:19.424755  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:19.449865  420062 cri.go:89] found id: ""
	I1217 20:33:19.449879  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.449886  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:19.449891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:19.449958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:19.474803  420062 cri.go:89] found id: ""
	I1217 20:33:19.474816  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.474833  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:19.474838  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:19.474909  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:19.503551  420062 cri.go:89] found id: ""
	I1217 20:33:19.503579  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.503598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:19.503603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:19.503687  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:19.529232  420062 cri.go:89] found id: ""
	I1217 20:33:19.529246  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.529259  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:19.529264  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:19.529330  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:19.554443  420062 cri.go:89] found id: ""
	I1217 20:33:19.554456  420062 logs.go:282] 0 containers: []
	W1217 20:33:19.554463  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:19.554481  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:19.554491  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:19.609391  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:19.609411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:19.625653  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:19.625669  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:19.691445  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:19.683737   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.684184   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.685768   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.686180   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:19.687608   11071 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:19.691456  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:19.691466  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:19.754663  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:19.754682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:22.297725  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:22.309139  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:22.309199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:22.334369  420062 cri.go:89] found id: ""
	I1217 20:33:22.334382  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.334390  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:22.334395  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:22.334458  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:22.363418  420062 cri.go:89] found id: ""
	I1217 20:33:22.363445  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.363453  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:22.363458  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:22.363531  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:22.388924  420062 cri.go:89] found id: ""
	I1217 20:33:22.388939  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.388947  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:22.388993  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:22.389056  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:22.415757  420062 cri.go:89] found id: ""
	I1217 20:33:22.415780  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.415787  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:22.415793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:22.415872  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:22.441520  420062 cri.go:89] found id: ""
	I1217 20:33:22.441534  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.441541  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:22.441546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:22.441605  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:22.480775  420062 cri.go:89] found id: ""
	I1217 20:33:22.480789  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.480795  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:22.480801  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:22.480873  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:22.505556  420062 cri.go:89] found id: ""
	I1217 20:33:22.505570  420062 logs.go:282] 0 containers: []
	W1217 20:33:22.505577  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:22.505585  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:22.505596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:22.562036  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:22.562054  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:22.577369  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:22.577386  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:22.647423  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:22.638838   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.639486   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641272   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.641956   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:22.643602   11175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:22.647453  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:22.647464  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:22.710153  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:22.710173  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.239783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:25.250945  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:25.251006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:25.277422  420062 cri.go:89] found id: ""
	I1217 20:33:25.277435  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.277443  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:25.277448  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:25.277510  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:25.303032  420062 cri.go:89] found id: ""
	I1217 20:33:25.303051  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.303063  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:25.303070  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:25.303176  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:25.333183  420062 cri.go:89] found id: ""
	I1217 20:33:25.333197  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.333204  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:25.333209  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:25.333272  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:25.358899  420062 cri.go:89] found id: ""
	I1217 20:33:25.358913  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.358920  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:25.358926  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:25.358986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:25.388611  420062 cri.go:89] found id: ""
	I1217 20:33:25.388625  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.388633  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:25.388638  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:25.388704  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:25.415829  420062 cri.go:89] found id: ""
	I1217 20:33:25.415844  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.415852  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:25.415857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:25.415913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:25.442921  420062 cri.go:89] found id: ""
	I1217 20:33:25.442935  420062 logs.go:282] 0 containers: []
	W1217 20:33:25.442941  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:25.442949  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:25.442965  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:25.459113  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:25.459135  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:25.535629  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:25.526636   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.527172   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.528838   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.529443   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:25.530989   11279 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:25.535645  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:25.535655  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:25.601950  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:25.601968  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:25.634192  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:25.634208  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.190569  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:28.200504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:28.200563  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:28.224311  420062 cri.go:89] found id: ""
	I1217 20:33:28.224325  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.224332  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:28.224338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:28.224396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:28.252603  420062 cri.go:89] found id: ""
	I1217 20:33:28.252622  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.252629  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:28.252634  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:28.252692  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:28.276684  420062 cri.go:89] found id: ""
	I1217 20:33:28.276697  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.276704  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:28.276709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:28.276777  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:28.299922  420062 cri.go:89] found id: ""
	I1217 20:33:28.299935  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.299942  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:28.299947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:28.300014  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:28.326124  420062 cri.go:89] found id: ""
	I1217 20:33:28.326137  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.326144  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:28.326150  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:28.326218  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:28.349497  420062 cri.go:89] found id: ""
	I1217 20:33:28.349510  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.349517  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:28.349523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:28.349579  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:28.378156  420062 cri.go:89] found id: ""
	I1217 20:33:28.378170  420062 logs.go:282] 0 containers: []
	W1217 20:33:28.378177  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:28.378185  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:28.378194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:28.434254  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:28.434274  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:28.448810  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:28.448837  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:28.521268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:28.512905   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.513656   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515366   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.515890   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:28.517404   11384 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:28.521279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:28.521290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:28.584201  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:28.584222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.112699  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:31.123315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:31.123377  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:31.151761  420062 cri.go:89] found id: ""
	I1217 20:33:31.151776  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.151783  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:31.151789  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:31.151849  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:31.177165  420062 cri.go:89] found id: ""
	I1217 20:33:31.177178  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.177186  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:31.177191  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:31.177262  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:31.205229  420062 cri.go:89] found id: ""
	I1217 20:33:31.205260  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.205267  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:31.205272  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:31.205341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:31.229570  420062 cri.go:89] found id: ""
	I1217 20:33:31.229584  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.229591  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:31.229597  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:31.229673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:31.258880  420062 cri.go:89] found id: ""
	I1217 20:33:31.258904  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.258911  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:31.258917  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:31.258983  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:31.286222  420062 cri.go:89] found id: ""
	I1217 20:33:31.286241  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.286248  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:31.286253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:31.286315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:31.311291  420062 cri.go:89] found id: ""
	I1217 20:33:31.311314  420062 logs.go:282] 0 containers: []
	W1217 20:33:31.311322  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:31.311330  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:31.311340  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:31.342524  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:31.342541  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:31.398421  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:31.398440  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:31.413476  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:31.413497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:31.478376  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:31.469734   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.470537   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472118   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.472657   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:31.474358   11501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:31.478388  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:31.478398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.044394  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:34.054571  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:34.054632  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:34.078791  420062 cri.go:89] found id: ""
	I1217 20:33:34.078815  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.078822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:34.078827  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:34.078902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:34.103484  420062 cri.go:89] found id: ""
	I1217 20:33:34.103498  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.103505  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:34.103510  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:34.103578  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:34.128330  420062 cri.go:89] found id: ""
	I1217 20:33:34.128343  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.128362  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:34.128368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:34.128436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:34.156115  420062 cri.go:89] found id: ""
	I1217 20:33:34.156129  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.156136  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:34.156141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:34.156208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:34.179862  420062 cri.go:89] found id: ""
	I1217 20:33:34.179876  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.179884  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:34.179889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:34.179959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:34.205717  420062 cri.go:89] found id: ""
	I1217 20:33:34.205731  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.205739  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:34.205745  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:34.205804  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:34.230674  420062 cri.go:89] found id: ""
	I1217 20:33:34.230689  420062 logs.go:282] 0 containers: []
	W1217 20:33:34.230702  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:34.230710  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:34.230720  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:34.286930  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:34.286949  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:34.301786  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:34.301803  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:34.365439  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:34.357724   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.358190   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.359660   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.360034   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:34.361429   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:34.365461  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:34.365473  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:34.426703  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:34.426724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:36.954941  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:36.964889  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:36.964949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:37.000981  420062 cri.go:89] found id: ""
	I1217 20:33:37.000999  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.001008  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:37.001014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:37.001098  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:37.036987  420062 cri.go:89] found id: ""
	I1217 20:33:37.037001  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.037008  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:37.037013  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:37.037083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:37.067078  420062 cri.go:89] found id: ""
	I1217 20:33:37.067092  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.067099  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:37.067105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:37.067173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:37.101494  420062 cri.go:89] found id: ""
	I1217 20:33:37.101509  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.101516  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:37.101522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:37.101582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:37.125577  420062 cri.go:89] found id: ""
	I1217 20:33:37.125591  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.125599  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:37.125604  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:37.125672  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:37.155006  420062 cri.go:89] found id: ""
	I1217 20:33:37.155022  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.155040  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:37.155045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:37.155105  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:37.180061  420062 cri.go:89] found id: ""
	I1217 20:33:37.180075  420062 logs.go:282] 0 containers: []
	W1217 20:33:37.180082  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:37.180090  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:37.180110  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:37.235716  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:37.235744  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:37.250676  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:37.250704  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:37.314789  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:37.307219   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.307729   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309210   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.309555   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:37.311019   11700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:37.314799  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:37.314811  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:37.376546  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:37.376566  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:39.904036  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:39.914146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:39.914209  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:39.942353  420062 cri.go:89] found id: ""
	I1217 20:33:39.942366  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.942374  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:39.942379  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:39.942445  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:39.970090  420062 cri.go:89] found id: ""
	I1217 20:33:39.970105  420062 logs.go:282] 0 containers: []
	W1217 20:33:39.970113  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:39.970119  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:39.970185  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:40.013204  420062 cri.go:89] found id: ""
	I1217 20:33:40.013220  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.013228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:40.013234  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:40.013312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:40.055438  420062 cri.go:89] found id: ""
	I1217 20:33:40.055453  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.055461  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:40.055467  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:40.055532  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:40.088240  420062 cri.go:89] found id: ""
	I1217 20:33:40.088285  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.088293  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:40.088298  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:40.088361  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:40.116666  420062 cri.go:89] found id: ""
	I1217 20:33:40.116680  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.116687  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:40.116693  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:40.116752  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:40.143935  420062 cri.go:89] found id: ""
	I1217 20:33:40.143951  420062 logs.go:282] 0 containers: []
	W1217 20:33:40.143965  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:40.143973  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:40.143986  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:40.199464  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:40.199484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:40.214665  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:40.214682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:40.285603  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:40.277391   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.277927   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.279526   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.280079   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:40.281668   11808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:40.285613  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:40.285623  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:40.348551  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:40.348571  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:42.882366  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:42.892346  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:42.892407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:42.917526  420062 cri.go:89] found id: ""
	I1217 20:33:42.917540  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.917548  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:42.917553  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:42.917622  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:42.941649  420062 cri.go:89] found id: ""
	I1217 20:33:42.941663  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.941670  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:42.941675  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:42.941737  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:42.965314  420062 cri.go:89] found id: ""
	I1217 20:33:42.965328  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.965335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:42.965341  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:42.965399  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:42.992861  420062 cri.go:89] found id: ""
	I1217 20:33:42.992875  420062 logs.go:282] 0 containers: []
	W1217 20:33:42.992882  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:42.992888  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:42.992949  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:43.026962  420062 cri.go:89] found id: ""
	I1217 20:33:43.026977  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.026984  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:43.026989  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:43.027048  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:43.056268  420062 cri.go:89] found id: ""
	I1217 20:33:43.056282  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.056289  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:43.056295  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:43.056353  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:43.088527  420062 cri.go:89] found id: ""
	I1217 20:33:43.088542  420062 logs.go:282] 0 containers: []
	W1217 20:33:43.088549  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:43.088556  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:43.088567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:43.115028  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:43.115044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:43.170239  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:43.170258  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:43.185453  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:43.185468  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:43.255155  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:33:43.247293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.247760   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249293   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.249636   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:43.251132   11923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:33:43.255166  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:43.255176  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:45.818750  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:45.829020  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:45.829084  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:45.854296  420062 cri.go:89] found id: ""
	I1217 20:33:45.854310  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.854319  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:45.854327  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:45.854393  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:45.884706  420062 cri.go:89] found id: ""
	I1217 20:33:45.884720  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.884728  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:45.884733  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:45.884795  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:45.909518  420062 cri.go:89] found id: ""
	I1217 20:33:45.909533  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.909540  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:45.909545  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:45.909615  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:45.935050  420062 cri.go:89] found id: ""
	I1217 20:33:45.935065  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.935073  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:45.935078  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:45.935155  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:45.964622  420062 cri.go:89] found id: ""
	I1217 20:33:45.964636  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.964643  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:45.964648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:45.964714  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:45.992340  420062 cri.go:89] found id: ""
	I1217 20:33:45.992355  420062 logs.go:282] 0 containers: []
	W1217 20:33:45.992363  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:45.992368  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:45.992432  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:46.029800  420062 cri.go:89] found id: ""
	I1217 20:33:46.029815  420062 logs.go:282] 0 containers: []
	W1217 20:33:46.029822  420062 logs.go:284] No container was found matching "kindnet"
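The crictl queries above are one check repeated per control-plane component. A compact sketch of that loop, assuming crictl is on the node's PATH and pointed at containerd's default CRI socket:

    # list containers in any state for each component; an empty result
    # corresponds to the `found id: ""` lines in the log
    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
                kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$name")
      echo "$name: ${ids:-<none>}"
    done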
	I1217 20:33:46.029841  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:46.029852  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:46.096203  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:46.096224  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
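The kubelet and dmesg passes are plain journalctl and filtered-dmesg reads; the flags below are copied from the log itself and can be run directly on the node:

    # last 400 kubelet journal entries
    sudo journalctl -u kubelet -n 400
    # kernel messages at warning level and above, no pager, no colors,
    # trimmed to the most recent 400 lines
    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400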
	I1217 20:33:46.111499  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:46.111517  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:46.174259  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:33:46.165992   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.166754   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168484   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.168848   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:46.170379   12014 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
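Every describe-nodes attempt fails identically: nothing is listening on localhost:8441 inside the node. One way to confirm TCP reachability independently of kubectl, assuming curl is available on the node, is:

    # expect "connection refused" while the apiserver is down; -k skips TLS
    # verification since only reachability matters here
    curl -sk --max-time 5 https://localhost:8441/healthz || echo "apiserver unreachable"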
	I1217 20:33:46.174269  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:46.174282  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:46.239891  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:46.239911  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
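The container-status pass relies on a shell fallback: prefer crictl when installed, otherwise fall back to docker. The backquoted expression in the log is equivalent to:

    # `which crictl || echo crictl` yields the crictl path when present, or the
    # bare word "crictl" (which then fails to run) when absent, so the
    # `|| sudo docker ps -a` branch takes over
    sudo "$(which crictl || echo crictl)" ps -a || sudo docker ps -a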
	I1217 20:33:48.769726  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:48.779731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:48.779796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:48.803697  420062 cri.go:89] found id: ""
	I1217 20:33:48.803710  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.803718  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:48.803723  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:48.803790  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:48.828947  420062 cri.go:89] found id: ""
	I1217 20:33:48.828966  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.828974  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:48.828979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:48.829045  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:48.853794  420062 cri.go:89] found id: ""
	I1217 20:33:48.853809  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.853815  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:48.853821  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:48.853884  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:48.879220  420062 cri.go:89] found id: ""
	I1217 20:33:48.879234  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.879241  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:48.879253  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:48.879316  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:48.905546  420062 cri.go:89] found id: ""
	I1217 20:33:48.905560  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.905567  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:48.905573  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:48.905639  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:48.931025  420062 cri.go:89] found id: ""
	I1217 20:33:48.931040  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.931047  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:48.931053  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:48.931111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:48.959554  420062 cri.go:89] found id: ""
	I1217 20:33:48.959567  420062 logs.go:282] 0 containers: []
	W1217 20:33:48.959575  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:48.959591  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:48.959603  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:49.037548  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:33:49.028333   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.029097   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.030655   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.031218   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:49.033613   12106 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
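Note that describe nodes is executed with the version-pinned kubectl and kubeconfig staged under /var/lib/minikube on the node, not the host's copies; reproducing it by hand looks like:

    # node-local kubectl, pointed at the node-local admin kubeconfig
    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes \
      --kubeconfig=/var/lib/minikube/kubeconfig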
	I1217 20:33:49.037558  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:49.037576  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:49.104606  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:49.104628  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:49.132120  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:49.132142  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:49.189781  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:49.189799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:51.705313  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:51.715310  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:51.715375  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:51.742788  420062 cri.go:89] found id: ""
	I1217 20:33:51.742803  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.742810  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:51.742816  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:51.742878  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:51.768132  420062 cri.go:89] found id: ""
	I1217 20:33:51.768147  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.768154  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:51.768160  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:51.768220  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:51.796803  420062 cri.go:89] found id: ""
	I1217 20:33:51.796817  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.796825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:51.796831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:51.796891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:51.823032  420062 cri.go:89] found id: ""
	I1217 20:33:51.823046  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.823054  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:51.823061  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:51.823122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:51.848750  420062 cri.go:89] found id: ""
	I1217 20:33:51.848765  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.848773  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:51.848778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:51.848840  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:51.874494  420062 cri.go:89] found id: ""
	I1217 20:33:51.874509  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.874516  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:51.874522  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:51.874582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:51.912240  420062 cri.go:89] found id: ""
	I1217 20:33:51.912273  420062 logs.go:282] 0 containers: []
	W1217 20:33:51.912281  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:51.912290  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:51.912301  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:51.940881  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:51.940897  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:51.997574  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:51.997596  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:52.016000  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:52.016018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:52.093264  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:33:52.084701   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.085399   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087055   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.087666   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:52.089311   12237 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:52.093274  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:52.093286  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:54.657449  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:54.667679  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:54.667741  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:54.696106  420062 cri.go:89] found id: ""
	I1217 20:33:54.696121  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.696128  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:54.696133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:54.696194  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:54.720578  420062 cri.go:89] found id: ""
	I1217 20:33:54.720592  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.720599  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:54.720605  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:54.720669  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:54.746036  420062 cri.go:89] found id: ""
	I1217 20:33:54.746050  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.746058  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:54.746063  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:54.746122  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:54.770192  420062 cri.go:89] found id: ""
	I1217 20:33:54.770206  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.770213  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:54.770219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:54.770275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:54.794365  420062 cri.go:89] found id: ""
	I1217 20:33:54.794379  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.794386  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:54.794391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:54.794454  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:54.818424  420062 cri.go:89] found id: ""
	I1217 20:33:54.818438  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.818446  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:54.818451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:54.818513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:54.843360  420062 cri.go:89] found id: ""
	I1217 20:33:54.843375  420062 logs.go:282] 0 containers: []
	W1217 20:33:54.843382  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:54.843401  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:54.843412  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:33:54.872684  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:54.872701  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:54.928831  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:54.928851  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:54.943545  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:54.943561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:55.020697  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:33:55.008146   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.009058   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.010012   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011180   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:55.011994   12336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:55.020721  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:55.020734  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.590507  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:33:57.600840  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:33:57.600911  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:33:57.628650  420062 cri.go:89] found id: ""
	I1217 20:33:57.628664  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.628671  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:33:57.628676  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:33:57.628736  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:33:57.653915  420062 cri.go:89] found id: ""
	I1217 20:33:57.653929  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.653936  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:33:57.653941  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:33:57.654005  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:33:57.677881  420062 cri.go:89] found id: ""
	I1217 20:33:57.677894  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.677901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:33:57.677906  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:33:57.677974  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:33:57.701808  420062 cri.go:89] found id: ""
	I1217 20:33:57.701823  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.701830  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:33:57.701836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:33:57.701894  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:33:57.725682  420062 cri.go:89] found id: ""
	I1217 20:33:57.725696  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.725703  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:33:57.725708  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:33:57.725770  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:33:57.753864  420062 cri.go:89] found id: ""
	I1217 20:33:57.753878  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.753885  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:33:57.753891  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:33:57.753948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:33:57.779180  420062 cri.go:89] found id: ""
	I1217 20:33:57.779193  420062 logs.go:282] 0 containers: []
	W1217 20:33:57.779200  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:33:57.779216  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:33:57.779227  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:33:57.834554  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:33:57.834575  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:33:57.849468  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:33:57.849484  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:33:57.917796  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:33:57.910011   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.910781   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912353   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.912882   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:33:57.913951   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:33:57.917816  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:33:57.917827  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:33:57.980535  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:33:57.980556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:00.519246  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:00.531028  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:00.531090  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:00.557919  420062 cri.go:89] found id: ""
	I1217 20:34:00.557933  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.557941  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:00.557947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:00.558006  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:00.583357  420062 cri.go:89] found id: ""
	I1217 20:34:00.583381  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.583389  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:00.583394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:00.583461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:00.608300  420062 cri.go:89] found id: ""
	I1217 20:34:00.608313  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.608321  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:00.608326  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:00.608396  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:00.633249  420062 cri.go:89] found id: ""
	I1217 20:34:00.633263  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.633271  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:00.633277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:00.633354  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:00.657998  420062 cri.go:89] found id: ""
	I1217 20:34:00.658012  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.658020  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:00.658025  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:00.658083  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:00.686479  420062 cri.go:89] found id: ""
	I1217 20:34:00.686494  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.686502  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:00.686517  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:00.686600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:00.715237  420062 cri.go:89] found id: ""
	I1217 20:34:00.715251  420062 logs.go:282] 0 containers: []
	W1217 20:34:00.715259  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:00.715281  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:00.715297  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:00.771736  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:00.771756  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:00.786569  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:00.786584  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:00.855532  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:34:00.846820   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.847617   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849290   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.849821   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:00.851435   12534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:00.855544  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:00.855556  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:00.929889  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:00.929917  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.457778  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:03.467767  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:03.467830  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:03.491745  420062 cri.go:89] found id: ""
	I1217 20:34:03.491760  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.491767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:03.491772  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:03.491834  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:03.516486  420062 cri.go:89] found id: ""
	I1217 20:34:03.516501  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.516508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:03.516514  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:03.516573  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:03.545504  420062 cri.go:89] found id: ""
	I1217 20:34:03.545518  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.545526  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:03.545531  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:03.545592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:03.570752  420062 cri.go:89] found id: ""
	I1217 20:34:03.570766  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.570773  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:03.570779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:03.570837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:03.599464  420062 cri.go:89] found id: ""
	I1217 20:34:03.599478  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.599486  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:03.599491  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:03.599551  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:03.626193  420062 cri.go:89] found id: ""
	I1217 20:34:03.626209  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.626217  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:03.626222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:03.626280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:03.650682  420062 cri.go:89] found id: ""
	I1217 20:34:03.650696  420062 logs.go:282] 0 containers: []
	W1217 20:34:03.650704  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:03.650712  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:03.650724  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:03.712614  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:34:03.705244   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.705869   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.706805   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.707331   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:03.708827   12634 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:03.712625  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:03.712636  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:03.775226  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:03.775247  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:03.801581  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:03.801600  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:03.857991  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:03.858013  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.373018  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:06.382912  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:06.382972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:06.408596  420062 cri.go:89] found id: ""
	I1217 20:34:06.408610  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.408617  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:06.408622  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:06.408681  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:06.437062  420062 cri.go:89] found id: ""
	I1217 20:34:06.437076  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.437083  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:06.437088  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:06.437149  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:06.463109  420062 cri.go:89] found id: ""
	I1217 20:34:06.463123  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.463130  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:06.463135  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:06.463198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:06.487450  420062 cri.go:89] found id: ""
	I1217 20:34:06.487463  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.487470  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:06.487476  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:06.487537  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:06.512848  420062 cri.go:89] found id: ""
	I1217 20:34:06.512863  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.512870  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:06.512876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:06.512939  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:06.536984  420062 cri.go:89] found id: ""
	I1217 20:34:06.536998  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.537006  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:06.537011  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:06.537069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:06.565689  420062 cri.go:89] found id: ""
	I1217 20:34:06.565732  420062 logs.go:282] 0 containers: []
	W1217 20:34:06.565740  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:06.565748  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:06.565758  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:06.626274  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:06.626294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:06.641612  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:06.641630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:06.703082  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	 output: 
	** stderr ** 
	E1217 20:34:06.694717   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.695365   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697091   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.697739   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:06.699357   12747 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:06.703092  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:06.703104  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:06.768202  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:06.768221  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:09.296397  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:09.306558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:09.306619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:09.330814  420062 cri.go:89] found id: ""
	I1217 20:34:09.330828  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.330836  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:09.330841  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:09.330900  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:09.360228  420062 cri.go:89] found id: ""
	I1217 20:34:09.360242  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.360270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:09.360276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:09.360336  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:09.383852  420062 cri.go:89] found id: ""
	I1217 20:34:09.383865  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.383871  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:09.383876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:09.383933  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:09.408740  420062 cri.go:89] found id: ""
	I1217 20:34:09.408753  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.408760  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:09.408765  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:09.408824  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:09.433879  420062 cri.go:89] found id: ""
	I1217 20:34:09.433894  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.433901  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:09.433907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:09.433965  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:09.458138  420062 cri.go:89] found id: ""
	I1217 20:34:09.458152  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.458160  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:09.458165  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:09.458223  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:09.482170  420062 cri.go:89] found id: ""
	I1217 20:34:09.482184  420062 logs.go:282] 0 containers: []
	W1217 20:34:09.482191  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:09.482199  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:09.482214  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:09.539809  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:09.539831  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:09.555108  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:09.555124  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:09.617755  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:09.608834   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.609449   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611182   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.611721   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:09.613344   12853 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:09.617779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:09.617790  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:09.680900  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:09.680920  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
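	The cycle above is minikube's diagnostic loop: probe for each control-plane container, find none, then gather kubelet, dmesg, describe-nodes, containerd, and container-status logs. The repeated "dial tcp [::1]:8441: connect: connection refused" lines mean nothing is listening on the apiserver port at all. A minimal Go sketch (illustrative only, not minikube's own code) reproduces that port check without involving kubectl:

	package main

	import (
		"fmt"
		"net"
		"time"
	)

	func main() {
		// A refused TCP dial is exactly what the kubectl errors above report:
		// no process is accepting connections on the apiserver port.
		conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
		if err != nil {
			fmt.Println("apiserver port closed:", err) // e.g. "connection refused"
			return
		}
		conn.Close()
		fmt.Println("something is listening on localhost:8441")
	}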
	I1217 20:34:12.217262  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:12.227378  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:12.227441  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:12.260904  420062 cri.go:89] found id: ""
	I1217 20:34:12.260918  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.260926  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:12.260931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:12.260991  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:12.290600  420062 cri.go:89] found id: ""
	I1217 20:34:12.290614  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.290621  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:12.290626  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:12.290694  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:12.317694  420062 cri.go:89] found id: ""
	I1217 20:34:12.317708  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.317716  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:12.317721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:12.317789  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:12.347280  420062 cri.go:89] found id: ""
	I1217 20:34:12.347300  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.347308  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:12.347323  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:12.347382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:12.375032  420062 cri.go:89] found id: ""
	I1217 20:34:12.375046  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.375054  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:12.375060  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:12.375121  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:12.400749  420062 cri.go:89] found id: ""
	I1217 20:34:12.400763  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.400771  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:12.400779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:12.400837  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:12.425915  420062 cri.go:89] found id: ""
	I1217 20:34:12.425929  420062 logs.go:282] 0 containers: []
	W1217 20:34:12.425937  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:12.425946  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:12.425957  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:12.486250  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:12.486269  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:12.501500  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:12.501515  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:12.571896  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:12.563218   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564136   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564679   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566300   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566801   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:12.563218   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564136   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.564679   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566300   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:12.566801   12958 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:12.571906  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:12.571921  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:12.635853  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:12.635876  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
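	Each probe in the loop runs `sudo crictl ps -a --quiet --name=<component>` and treats empty output as "no container found", which is why every probe above logs `found id: ""` followed by `0 containers`. A hedged Go sketch of the same probe (it assumes crictl and sudo are available on the host; the helper name containerIDs is made up for illustration):

	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	// containerIDs lists container IDs whose name matches the given filter,
	// mirroring the crictl invocation shown in the log above.
	func containerIDs(name string) ([]string, error) {
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
		if err != nil {
			return nil, err
		}
		// --quiet prints one container ID per line; empty output means none.
		return strings.Fields(strings.TrimSpace(string(out))), nil
	}

	func main() {
		for _, c := range []string{"kube-apiserver", "etcd", "coredns"} {
			ids, err := containerIDs(c)
			if err != nil {
				fmt.Println(c, "probe failed:", err)
				continue
			}
			fmt.Printf("%s: %d containers: %v\n", c, len(ids), ids)
		}
	}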
	I1217 20:34:15.166604  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:15.177581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:15.177645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:15.201800  420062 cri.go:89] found id: ""
	I1217 20:34:15.201815  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.201822  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:15.201828  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:15.201892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:15.229609  420062 cri.go:89] found id: ""
	I1217 20:34:15.229624  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.229631  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:15.229636  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:15.229703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:15.257583  420062 cri.go:89] found id: ""
	I1217 20:34:15.257597  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.257605  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:15.257610  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:15.257673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:15.291085  420062 cri.go:89] found id: ""
	I1217 20:34:15.291099  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.291106  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:15.291112  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:15.291190  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:15.324198  420062 cri.go:89] found id: ""
	I1217 20:34:15.324212  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.324219  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:15.324226  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:15.324317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:15.348977  420062 cri.go:89] found id: ""
	I1217 20:34:15.348991  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.348998  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:15.349004  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:15.349069  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:15.373132  420062 cri.go:89] found id: ""
	I1217 20:34:15.373147  420062 logs.go:282] 0 containers: []
	W1217 20:34:15.373155  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:15.373162  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:15.373174  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:15.387711  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:15.387728  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:15.453164  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:15.443181   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.443915   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.445657   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.447470   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.448047   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:15.443181   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.443915   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.445657   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.447470   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:15.448047   13061 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:15.453175  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:15.453187  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:15.519197  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:15.519219  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:15.547781  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:15.547799  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
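	The timestamps show the whole cycle repeating roughly every three seconds. The retry pattern looks like the following Go sketch; the interval and deadline here are illustrative guesses, not minikube's actual settings:

	package main

	import (
		"errors"
		"fmt"
		"net"
		"time"
	)

	// waitForPort polls a TCP address until it accepts a connection or the
	// deadline passes, sleeping a fixed interval between attempts.
	func waitForPort(addr string, interval, deadline time.Duration) error {
		stop := time.Now().Add(deadline)
		for time.Now().Before(stop) {
			conn, err := net.DialTimeout("tcp", addr, time.Second)
			if err == nil {
				conn.Close()
				return nil
			}
			time.Sleep(interval)
		}
		return errors.New("timed out waiting for " + addr)
	}

	func main() {
		if err := waitForPort("localhost:8441", 3*time.Second, time.Minute); err != nil {
			fmt.Println(err)
			return
		}
		fmt.Println("apiserver port is open")
	}

	A bare TCP poll is only a cheap readiness signal; it cannot distinguish a healthy apiserver from one that accepts connections but fails requests, which is presumably why the loop above also runs the kubectl and crictl checks each round.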
	I1217 20:34:18.106475  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:18.117557  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:18.117619  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:18.142233  420062 cri.go:89] found id: ""
	I1217 20:34:18.142246  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.142253  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:18.142258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:18.142319  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:18.166913  420062 cri.go:89] found id: ""
	I1217 20:34:18.166927  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.166934  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:18.166940  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:18.167002  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:18.195856  420062 cri.go:89] found id: ""
	I1217 20:34:18.195870  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.195877  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:18.195883  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:18.195944  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:18.222291  420062 cri.go:89] found id: ""
	I1217 20:34:18.222306  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.222313  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:18.222318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:18.222382  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:18.254911  420062 cri.go:89] found id: ""
	I1217 20:34:18.254925  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.254932  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:18.254937  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:18.254996  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:18.299082  420062 cri.go:89] found id: ""
	I1217 20:34:18.299096  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.299103  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:18.299109  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:18.299173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:18.323848  420062 cri.go:89] found id: ""
	I1217 20:34:18.323862  420062 logs.go:282] 0 containers: []
	W1217 20:34:18.323869  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:18.323877  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:18.323888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:18.381056  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:18.381082  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:18.395602  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:18.395617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:18.459223  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:18.450909   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.451543   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453107   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453711   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.455276   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:18.450909   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.451543   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453107   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.453711   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:18.455276   13167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:18.459233  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:18.459244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:18.522287  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:18.522307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:21.051832  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:21.062206  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:21.062275  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:21.090124  420062 cri.go:89] found id: ""
	I1217 20:34:21.090139  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.090146  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:21.090151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:21.090211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:21.114268  420062 cri.go:89] found id: ""
	I1217 20:34:21.114282  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.114289  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:21.114294  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:21.114357  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:21.141585  420062 cri.go:89] found id: ""
	I1217 20:34:21.141599  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.141606  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:21.141611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:21.141673  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:21.167173  420062 cri.go:89] found id: ""
	I1217 20:34:21.167187  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.167195  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:21.167200  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:21.167277  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:21.191543  420062 cri.go:89] found id: ""
	I1217 20:34:21.191557  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.191564  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:21.191569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:21.191640  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:21.219365  420062 cri.go:89] found id: ""
	I1217 20:34:21.219378  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.219385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:21.219390  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:21.219451  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:21.256303  420062 cri.go:89] found id: ""
	I1217 20:34:21.256317  420062 logs.go:282] 0 containers: []
	W1217 20:34:21.256324  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:21.256332  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:21.256342  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:21.323014  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:21.323035  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:21.337647  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:21.337664  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:21.400131  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:21.391524   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.392409   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394006   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394305   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.395921   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:21.391524   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.392409   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394006   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.394305   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:21.395921   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:21.400140  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:21.400151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:21.467704  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:21.467725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
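	The failing describe-nodes step shells out to kubectl with --kubeconfig=/var/lib/minikube/kubeconfig. The same query can be issued programmatically with client-go, and against a dead apiserver it fails with the same connection-refused error. A sketch follows (the kubeconfig path is taken from the log above; everything else is illustrative and requires the k8s.io/client-go module):

	package main

	import (
		"context"
		"fmt"

		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	func main() {
		// Load the same kubeconfig the failing kubectl invocation uses.
		cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/minikube/kubeconfig")
		if err != nil {
			fmt.Println("load kubeconfig:", err)
			return
		}
		cs, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			fmt.Println("build client:", err)
			return
		}
		nodes, err := cs.CoreV1().Nodes().List(context.Background(), metav1.ListOptions{})
		if err != nil {
			// While the apiserver is down this fails with "connection refused",
			// matching the stderr blocks in the log.
			fmt.Println("list nodes:", err)
			return
		}
		for _, n := range nodes.Items {
			fmt.Println(n.Name)
		}
	}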
	I1217 20:34:23.996278  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:24.008421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:24.008487  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:24.035322  420062 cri.go:89] found id: ""
	I1217 20:34:24.035336  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.035344  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:24.035349  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:24.035413  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:24.060026  420062 cri.go:89] found id: ""
	I1217 20:34:24.060040  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.060048  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:24.060054  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:24.060131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:24.085236  420062 cri.go:89] found id: ""
	I1217 20:34:24.085250  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.085257  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:24.085263  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:24.085323  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:24.110730  420062 cri.go:89] found id: ""
	I1217 20:34:24.110763  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.110772  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:24.110778  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:24.110851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:24.138006  420062 cri.go:89] found id: ""
	I1217 20:34:24.138020  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.138028  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:24.138034  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:24.138094  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:24.168065  420062 cri.go:89] found id: ""
	I1217 20:34:24.168080  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.168094  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:24.168100  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:24.168172  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:24.193244  420062 cri.go:89] found id: ""
	I1217 20:34:24.193258  420062 logs.go:282] 0 containers: []
	W1217 20:34:24.193265  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:24.193273  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:24.193284  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:24.260181  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:24.260201  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:24.299429  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:24.299446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:24.355633  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:24.355653  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:24.371493  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:24.371508  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:24.439767  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:24.431274   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.432160   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.433728   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.434387   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:24.435768   13389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:26.940651  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:26.951081  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:26.951148  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:26.975583  420062 cri.go:89] found id: ""
	I1217 20:34:26.975598  420062 logs.go:282] 0 containers: []
	W1217 20:34:26.975606  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:26.975611  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:26.975671  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:27.003924  420062 cri.go:89] found id: ""
	I1217 20:34:27.003939  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.003948  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:27.003954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:27.004018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:27.029433  420062 cri.go:89] found id: ""
	I1217 20:34:27.029446  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.029454  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:27.029460  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:27.029520  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:27.055977  420062 cri.go:89] found id: ""
	I1217 20:34:27.055990  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.055998  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:27.056027  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:27.056093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:27.081756  420062 cri.go:89] found id: ""
	I1217 20:34:27.081770  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.081777  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:27.081783  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:27.081846  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:27.106532  420062 cri.go:89] found id: ""
	I1217 20:34:27.106546  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.106554  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:27.106587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:27.106651  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:27.131573  420062 cri.go:89] found id: ""
	I1217 20:34:27.131587  420062 logs.go:282] 0 containers: []
	W1217 20:34:27.131595  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:27.131603  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:27.131613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:27.194270  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:27.194290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:27.222438  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:27.222453  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:27.284134  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:27.284154  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:27.300336  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:27.300352  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:27.369337  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:27.360889   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.361563   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363199   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.363762   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:27.365407   13496 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
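	Each failure block prints an empty stdout next to a populated stderr because the runner captures the two streams separately, and kubectl writes its connection errors to stderr before exiting with status 1. A rough Go sketch of that capture pattern (not minikube's actual runner; it assumes a kubectl binary on PATH rather than the versioned path in the log):

	package main

	import (
		"bytes"
		"fmt"
		"os/exec"
	)

	func main() {
		cmd := exec.Command("kubectl", "describe", "nodes",
			"--kubeconfig=/var/lib/minikube/kubeconfig")
		var stdout, stderr bytes.Buffer
		cmd.Stdout = &stdout // kubectl's normal output lands here
		cmd.Stderr = &stderr // connection errors land here
		err := cmd.Run()
		fmt.Printf("stdout:\n%s\n", stdout.String())
		fmt.Printf("stderr:\n%s\n", stderr.String())
		if err != nil {
			fmt.Println("command failed:", err) // e.g. "exit status 1"
		}
	}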
	I1217 20:34:29.871004  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:29.881325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:29.881389  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:29.906739  420062 cri.go:89] found id: ""
	I1217 20:34:29.906753  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.906760  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:29.906766  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:29.906828  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:29.935023  420062 cri.go:89] found id: ""
	I1217 20:34:29.935037  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.935045  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:29.935049  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:29.935110  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:29.968427  420062 cri.go:89] found id: ""
	I1217 20:34:29.968442  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.968449  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:29.968454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:29.968514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:29.993120  420062 cri.go:89] found id: ""
	I1217 20:34:29.993133  420062 logs.go:282] 0 containers: []
	W1217 20:34:29.993141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:29.993147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:29.993208  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:30.038216  420062 cri.go:89] found id: ""
	I1217 20:34:30.038232  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.038240  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:30.038256  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:30.038331  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:30.088044  420062 cri.go:89] found id: ""
	I1217 20:34:30.088059  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.088067  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:30.088080  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:30.088145  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:30.116773  420062 cri.go:89] found id: ""
	I1217 20:34:30.116789  420062 logs.go:282] 0 containers: []
	W1217 20:34:30.116798  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:30.116808  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:30.116819  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:30.175618  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:30.175638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:30.191950  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:30.191967  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:30.268938  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:30.259892   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.260676   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262229   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.262537   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:30.263971   13586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:30.268949  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:30.268960  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:30.345609  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:30.345631  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:32.873852  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:32.884009  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:32.884072  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:32.908673  420062 cri.go:89] found id: ""
	I1217 20:34:32.908688  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.908696  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:32.908701  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:32.908761  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:32.933101  420062 cri.go:89] found id: ""
	I1217 20:34:32.933115  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.933122  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:32.933127  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:32.933192  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:32.956968  420062 cri.go:89] found id: ""
	I1217 20:34:32.956982  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.956991  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:32.956996  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:32.957054  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:32.982228  420062 cri.go:89] found id: ""
	I1217 20:34:32.982241  420062 logs.go:282] 0 containers: []
	W1217 20:34:32.982249  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:32.982254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:32.982312  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:33.011791  420062 cri.go:89] found id: ""
	I1217 20:34:33.011805  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.011812  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:33.011818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:33.011885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:33.038878  420062 cri.go:89] found id: ""
	I1217 20:34:33.038894  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.038901  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:33.038907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:33.038969  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:33.068421  420062 cri.go:89] found id: ""
	I1217 20:34:33.068436  420062 logs.go:282] 0 containers: []
	W1217 20:34:33.068443  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:33.068453  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:33.068463  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:33.083444  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:33.083461  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:33.147593  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:33.139067   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.139640   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141533   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.141989   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:33.143520   13691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
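	(The cycle above, a pgrep for a kube-apiserver process followed by one "crictl ps" query per control-plane component, repeats every few seconds until minikube's start deadline expires, and every probe comes back empty. A minimal sketch of the same probe, assuming a shell on the node with crictl on PATH; the component names come from the log, the loop itself is illustrative:

	    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet; do
	        # an empty result matches the `found id: ""` lines in the log
	        sudo crictl ps -a --quiet --name="$name"
	    done
	)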
	I1217 20:34:33.147604  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:33.147617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:33.211005  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:33.211025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:33.247311  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:33.247327  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:35.820692  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:35.830805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:35.830879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:35.855694  420062 cri.go:89] found id: ""
	I1217 20:34:35.855708  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.855716  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:35.855721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:35.855780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:35.879932  420062 cri.go:89] found id: ""
	I1217 20:34:35.879947  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.879955  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:35.879960  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:35.880021  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:35.904606  420062 cri.go:89] found id: ""
	I1217 20:34:35.904622  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.904630  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:35.904635  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:35.904700  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:35.932655  420062 cri.go:89] found id: ""
	I1217 20:34:35.932669  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.932676  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:35.932681  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:35.932742  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:35.956665  420062 cri.go:89] found id: ""
	I1217 20:34:35.956679  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.956686  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:35.956691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:35.956748  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:35.981363  420062 cri.go:89] found id: ""
	I1217 20:34:35.981377  420062 logs.go:282] 0 containers: []
	W1217 20:34:35.981385  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:35.981391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:35.981450  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:36.013052  420062 cri.go:89] found id: ""
	I1217 20:34:36.013068  420062 logs.go:282] 0 containers: []
	W1217 20:34:36.013076  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:36.013084  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:36.013097  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:36.080346  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:36.080367  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:36.109280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:36.109296  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:36.168612  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:36.168630  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:36.183490  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:36.183505  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:36.254206  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:36.245334   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.246226   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.247937   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.248300   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:36.249802   13808 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:38.754461  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:38.764820  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:38.764885  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:38.790226  420062 cri.go:89] found id: ""
	I1217 20:34:38.790243  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.790251  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:38.790257  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:38.790317  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:38.815898  420062 cri.go:89] found id: ""
	I1217 20:34:38.815913  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.815920  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:38.815925  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:38.815986  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:38.840879  420062 cri.go:89] found id: ""
	I1217 20:34:38.840894  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.840901  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:38.840907  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:38.840967  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:38.865756  420062 cri.go:89] found id: ""
	I1217 20:34:38.865772  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.865780  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:38.865785  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:38.865851  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:38.893497  420062 cri.go:89] found id: ""
	I1217 20:34:38.893511  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.893518  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:38.893523  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:38.893582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:38.918737  420062 cri.go:89] found id: ""
	I1217 20:34:38.918751  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.918758  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:38.918763  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:38.918821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:38.943126  420062 cri.go:89] found id: ""
	I1217 20:34:38.943140  420062 logs.go:282] 0 containers: []
	W1217 20:34:38.943147  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:38.943155  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:38.943166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:39.008933  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:38.999020   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.000025   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.001953   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.002737   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:39.004715   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:39.008944  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:39.008955  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:39.071529  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:39.071550  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:39.098851  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:39.098866  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:39.157559  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:39.157578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.673292  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:41.683569  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:41.683631  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:41.712444  420062 cri.go:89] found id: ""
	I1217 20:34:41.712458  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.712466  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:41.712471  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:41.712540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:41.737230  420062 cri.go:89] found id: ""
	I1217 20:34:41.737244  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.737253  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:41.737258  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:41.737320  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:41.765904  420062 cri.go:89] found id: ""
	I1217 20:34:41.765918  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.765926  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:41.765931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:41.765993  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:41.790803  420062 cri.go:89] found id: ""
	I1217 20:34:41.790818  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.790826  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:41.790831  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:41.790891  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:41.816378  420062 cri.go:89] found id: ""
	I1217 20:34:41.816393  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.816399  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:41.816405  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:41.816465  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:41.846163  420062 cri.go:89] found id: ""
	I1217 20:34:41.846177  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.846184  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:41.846190  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:41.846249  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:41.874235  420062 cri.go:89] found id: ""
	I1217 20:34:41.874249  420062 logs.go:282] 0 containers: []
	W1217 20:34:41.874257  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:41.874264  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:41.874278  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:41.930007  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:41.930025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:41.944733  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:41.944748  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:42.015145  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:42.005958   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.007326   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.008416   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009480   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:42.009948   14003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:42.015157  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:42.015168  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:42.083018  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:42.083046  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.617783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:44.627898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:44.627959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:44.654510  420062 cri.go:89] found id: ""
	I1217 20:34:44.654524  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.654531  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:44.654536  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:44.654600  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:44.681532  420062 cri.go:89] found id: ""
	I1217 20:34:44.681547  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.681554  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:44.681560  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:44.681620  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:44.705927  420062 cri.go:89] found id: ""
	I1217 20:34:44.705941  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.705948  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:44.705953  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:44.706010  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:44.730835  420062 cri.go:89] found id: ""
	I1217 20:34:44.730849  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.730857  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:44.730862  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:44.730925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:44.754987  420062 cri.go:89] found id: ""
	I1217 20:34:44.755002  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.755009  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:44.755014  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:44.755074  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:44.778787  420062 cri.go:89] found id: ""
	I1217 20:34:44.778801  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.778808  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:44.778814  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:44.778874  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:44.804370  420062 cri.go:89] found id: ""
	I1217 20:34:44.804385  420062 logs.go:282] 0 containers: []
	W1217 20:34:44.804392  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:44.804401  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:44.804411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:44.870852  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:44.870872  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:44.901529  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:44.901545  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:44.961405  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:44.961428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:44.976411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:44.976427  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:45.055180  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:45.045055   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.046486   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.047127   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.048790   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:45.049451   14122 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.555437  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:47.565320  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:47.565380  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:47.594473  420062 cri.go:89] found id: ""
	I1217 20:34:47.594488  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.594495  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:47.594500  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:47.594560  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:47.618819  420062 cri.go:89] found id: ""
	I1217 20:34:47.618833  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.618840  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:47.618845  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:47.618906  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:47.643299  420062 cri.go:89] found id: ""
	I1217 20:34:47.643313  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.643320  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:47.643325  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:47.643386  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:47.668500  420062 cri.go:89] found id: ""
	I1217 20:34:47.668514  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.668522  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:47.668527  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:47.668588  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:47.694650  420062 cri.go:89] found id: ""
	I1217 20:34:47.694671  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.694678  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:47.694683  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:47.694745  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:47.729169  420062 cri.go:89] found id: ""
	I1217 20:34:47.729183  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.729192  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:47.729197  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:47.729258  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:47.753481  420062 cri.go:89] found id: ""
	I1217 20:34:47.753494  420062 logs.go:282] 0 containers: []
	W1217 20:34:47.753501  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:47.753509  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:47.753521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:47.768175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:47.768192  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:47.832224  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:47.823643   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.824432   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826211   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.826814   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:47.828509   14212 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:47.832234  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:47.832264  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:47.894275  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:47.894294  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:47.921621  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:47.921638  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.477347  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:50.487837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:50.487905  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:50.515440  420062 cri.go:89] found id: ""
	I1217 20:34:50.515460  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.515468  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:50.515473  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:50.515545  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:50.542521  420062 cri.go:89] found id: ""
	I1217 20:34:50.542546  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.542553  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:50.542559  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:50.542629  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:50.569586  420062 cri.go:89] found id: ""
	I1217 20:34:50.569600  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.569613  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:50.569618  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:50.569677  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:50.597938  420062 cri.go:89] found id: ""
	I1217 20:34:50.597951  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.597958  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:50.597966  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:50.598024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:50.627019  420062 cri.go:89] found id: ""
	I1217 20:34:50.627044  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.627052  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:50.627057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:50.627128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:50.655921  420062 cri.go:89] found id: ""
	I1217 20:34:50.655948  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.655956  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:50.655962  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:50.656028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:50.680457  420062 cri.go:89] found id: ""
	I1217 20:34:50.680471  420062 logs.go:282] 0 containers: []
	W1217 20:34:50.680479  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:50.680487  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:50.680502  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:50.742350  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:50.734040   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.734460   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736277   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.736697   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:50.738252   14311 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
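	(Every describe-nodes attempt fails identically: nothing is listening on the apiserver port, 8441. A quick hypothetical check from a shell on the node, assuming curl is available:

	    # "connection refused" here confirms no apiserver is bound to the port
	    curl -sk https://localhost:8441/healthz || echo "apiserver not listening on 8441"
	    # matches the empty crictl results above: the container was never created
	    sudo crictl ps -a --name=kube-apiserver
	)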
	I1217 20:34:50.742360  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:50.742370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:50.802977  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:50.802997  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:50.830354  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:50.830370  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:50.887850  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:50.887869  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.403065  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:53.413162  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:53.413227  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:53.437500  420062 cri.go:89] found id: ""
	I1217 20:34:53.437513  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.437521  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:53.437526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:53.437592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:53.462889  420062 cri.go:89] found id: ""
	I1217 20:34:53.462902  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.462910  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:53.462915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:53.462972  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:53.493212  420062 cri.go:89] found id: ""
	I1217 20:34:53.493226  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.493234  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:53.493239  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:53.493301  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:53.521829  420062 cri.go:89] found id: ""
	I1217 20:34:53.521844  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.521851  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:53.521857  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:53.521919  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:53.558427  420062 cri.go:89] found id: ""
	I1217 20:34:53.558442  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.558449  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:53.558454  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:53.558513  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:53.583439  420062 cri.go:89] found id: ""
	I1217 20:34:53.583453  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.583460  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:53.583466  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:53.583526  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:53.608693  420062 cri.go:89] found id: ""
	I1217 20:34:53.608707  420062 logs.go:282] 0 containers: []
	W1217 20:34:53.608714  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:53.608722  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:53.608732  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:53.664959  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:53.664980  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:53.679865  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:53.679886  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:53.742568  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:53.733840   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.734623   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736275   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.736848   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:53.738561   14422 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1217 20:34:53.742579  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:53.742591  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:53.803297  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:53.803317  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.335304  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:56.344915  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:56.344977  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:56.368289  420062 cri.go:89] found id: ""
	I1217 20:34:56.368304  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.368312  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:56.368319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:56.368388  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:56.392693  420062 cri.go:89] found id: ""
	I1217 20:34:56.392707  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.392715  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:56.392721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:56.392782  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:56.419795  420062 cri.go:89] found id: ""
	I1217 20:34:56.419809  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.419825  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:56.419834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:56.419902  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:56.445038  420062 cri.go:89] found id: ""
	I1217 20:34:56.445052  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.445060  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:56.445065  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:56.445128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:56.474272  420062 cri.go:89] found id: ""
	I1217 20:34:56.474287  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.474294  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:56.474300  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:56.474366  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:56.507935  420062 cri.go:89] found id: ""
	I1217 20:34:56.507950  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.507957  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:56.507963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:56.508030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:56.535999  420062 cri.go:89] found id: ""
	I1217 20:34:56.536012  420062 logs.go:282] 0 containers: []
	W1217 20:34:56.536030  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:56.536039  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:56.536050  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:56.572020  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:56.572037  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:56.628661  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:56.628681  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:34:56.643833  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:56.643856  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:56.710351  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:56.701895   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.702686   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704396   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.704960   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:56.706438   14535 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
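The failure mode is identical on every attempt: kubectl, run with the node-local kubeconfig, cannot even open a TCP connection to `localhost:8441`, the apiserver port used throughout this log. `connect: connection refused` means the TCP handshake itself fails because nothing is listening, so this is a socket-level failure rather than an HTTP, TLS, or RBAC problem, consistent with the empty kube-apiserver container list above. A two-line probe that reproduces exactly that distinction (host and port copied from the log):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// If no process is listening on the apiserver port, this fails with
	// "connect: connection refused", matching the kubectl errors above.
	conn, err := net.DialTimeout("tcp", "localhost:8441", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("something is listening on :8441")
}
```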
	I1217 20:34:56.710361  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:56.710380  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.273579  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:34:59.283581  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:34:59.283645  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:34:59.309480  420062 cri.go:89] found id: ""
	I1217 20:34:59.309493  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.309500  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:34:59.309506  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:34:59.309564  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:34:59.333365  420062 cri.go:89] found id: ""
	I1217 20:34:59.333378  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.333386  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:34:59.333391  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:34:59.333452  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:34:59.357207  420062 cri.go:89] found id: ""
	I1217 20:34:59.357221  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.357228  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:34:59.357233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:34:59.357298  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:34:59.381758  420062 cri.go:89] found id: ""
	I1217 20:34:59.381772  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.381781  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:34:59.381787  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:34:59.381845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:34:59.406750  420062 cri.go:89] found id: ""
	I1217 20:34:59.406764  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.406772  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:34:59.406777  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:34:59.406845  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:34:59.431825  420062 cri.go:89] found id: ""
	I1217 20:34:59.431838  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.431846  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:34:59.431852  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:34:59.431913  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:34:59.458993  420062 cri.go:89] found id: ""
	I1217 20:34:59.459007  420062 logs.go:282] 0 containers: []
	W1217 20:34:59.459014  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:34:59.459022  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:34:59.459041  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:34:59.546381  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:34:59.527767   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.528143   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.536500   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.537248   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:34:59.538811   14616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:34:59.546391  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:34:59.546401  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:34:59.613987  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:34:59.614007  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:34:59.644296  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:34:59.644311  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:34:59.703226  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:34:59.703245  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
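Four log sources are pulled on every pass, in varying order: the kubelet and containerd journald units (last 400 lines each), a severity-filtered kernel log, and a container listing. In the dmesg invocation, `-P` disables the pager, `-H` keeps the human-readable output format, `-L=never` suppresses colour escape codes so they don't pollute the captured log, and `--level warn,err,crit,alert,emerg` drops anything below warning severity. The same gatherers, runnable locally on the node (commands copied verbatim from the log; root required):

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// The four gatherers the loop above runs over SSH, executed locally.
	// Each is wrapped in `bash -c`, as ssh_runner does, so the pipe,
	// backticks, and `||` fallbacks behave the same way.
	cmds := []struct{ name, cmd string }{
		{"kubelet", "sudo journalctl -u kubelet -n 400"},
		{"containerd", "sudo journalctl -u containerd -n 400"},
		{"dmesg", "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"},
		{"container status", "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"},
	}
	for _, c := range cmds {
		out, err := exec.Command("/bin/bash", "-c", c.cmd).CombinedOutput()
		fmt.Printf("== %s (err=%v) ==\n%s\n", c.name, err, out)
	}
}
```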
	I1217 20:35:02.218783  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:02.229042  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:02.229114  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:02.254286  420062 cri.go:89] found id: ""
	I1217 20:35:02.254300  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.254308  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:02.254315  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:02.254374  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:02.281092  420062 cri.go:89] found id: ""
	I1217 20:35:02.281106  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.281114  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:02.281120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:02.281198  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:02.310195  420062 cri.go:89] found id: ""
	I1217 20:35:02.310209  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.310217  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:02.310222  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:02.310294  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:02.338807  420062 cri.go:89] found id: ""
	I1217 20:35:02.338821  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.338829  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:02.338834  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:02.338904  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:02.364604  420062 cri.go:89] found id: ""
	I1217 20:35:02.364618  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.364625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:02.364631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:02.364693  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:02.389458  420062 cri.go:89] found id: ""
	I1217 20:35:02.389473  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.389481  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:02.389486  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:02.389544  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:02.419120  420062 cri.go:89] found id: ""
	I1217 20:35:02.419134  420062 logs.go:282] 0 containers: []
	W1217 20:35:02.419142  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:02.419151  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:02.419162  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:02.476620  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:02.476640  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:02.492411  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:02.492428  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:02.567285  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:02.558682   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.559341   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.560957   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.561461   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:02.562999   14725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:02.567294  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:02.567308  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:02.635002  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:02.635022  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
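The timestamps give away the overall shape: a full probe-and-gather cycle completes, then roughly three seconds pass before the next `sudo pgrep -xnf kube-apiserver.*minikube.*` (20:34:56 → 20:34:59 → 20:35:02 → 20:35:05 ...). In other words, this is a poll-until-healthy loop that collects a fresh diagnostic bundle on every miss. A sketch of that control flow (the three-second interval is read off the log; the six-minute deadline is an assumption, not minikube's exact value):

```go
package main

import (
	"context"
	"fmt"
	"os/exec"
	"time"
)

// apiserverRunning mirrors the first check in each cycle above: pgrep for a
// kube-apiserver process whose command line mentions the profile.
func apiserverRunning() bool {
	return exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 6*time.Minute) // assumed deadline
	defer cancel()

	ticker := time.NewTicker(3 * time.Second) // matches the ~3 s gap between cycles in the log
	defer ticker.Stop()

	for {
		select {
		case <-ctx.Done():
			fmt.Println("gave up waiting for kube-apiserver:", ctx.Err())
			return
		case <-ticker.C:
			if apiserverRunning() {
				fmt.Println("kube-apiserver is up")
				return
			}
			// In minikube this is where the log bundle above gets collected.
		}
	}
}
```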
	I1217 20:35:05.163567  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:05.174184  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:05.174245  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:05.199116  420062 cri.go:89] found id: ""
	I1217 20:35:05.199130  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.199137  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:05.199143  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:05.199206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:05.223477  420062 cri.go:89] found id: ""
	I1217 20:35:05.223491  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.223498  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:05.223504  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:05.223562  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:05.247303  420062 cri.go:89] found id: ""
	I1217 20:35:05.247317  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.247325  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:05.247332  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:05.247391  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:05.272620  420062 cri.go:89] found id: ""
	I1217 20:35:05.272633  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.272641  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:05.272646  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:05.272703  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:05.300419  420062 cri.go:89] found id: ""
	I1217 20:35:05.300434  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.300441  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:05.300446  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:05.300505  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:05.325851  420062 cri.go:89] found id: ""
	I1217 20:35:05.325866  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.325873  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:05.325879  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:05.325938  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:05.354430  420062 cri.go:89] found id: ""
	I1217 20:35:05.354445  420062 logs.go:282] 0 containers: []
	W1217 20:35:05.354452  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:05.354460  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:05.354475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:05.369668  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:05.369686  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:05.436390  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:05.427472   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.428087   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.429823   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.430630   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:05.432463   14829 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:05.436400  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:05.436411  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:05.499177  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:05.499202  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:05.531231  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:05.531248  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
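Note the `ssh_runner.go:195` prefix on every `Run:` line: none of these commands execute on the CI host itself; each one is a fresh session over SSH into the minikube node. A stripped-down version of that pattern with golang.org/x/crypto/ssh (the address and credentials here are placeholders; minikube actually authenticates with the node's generated SSH key):

```go
package main

import (
	"fmt"
	"log"

	"golang.org/x/crypto/ssh"
)

func main() {
	// Placeholder credentials, for illustration only.
	cfg := &ssh.ClientConfig{
		User:            "docker",
		Auth:            []ssh.AuthMethod{ssh.Password("changeme")},
		HostKeyCallback: ssh.InsecureIgnoreHostKey(), // acceptable for a local test node only
	}
	client, err := ssh.Dial("tcp", "127.0.0.1:22", cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// One session per command, like each `ssh_runner.go:195] Run:` line above.
	sess, err := client.NewSession()
	if err != nil {
		log.Fatal(err)
	}
	defer sess.Close()

	out, err := sess.CombinedOutput("sudo crictl ps -a --quiet --name=kube-apiserver")
	fmt.Printf("err=%v output=%q\n", err, out)
}
```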
	I1217 20:35:08.088375  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:08.098640  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:08.098711  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:08.132112  420062 cri.go:89] found id: ""
	I1217 20:35:08.132127  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:08.132141  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:08.132205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:08.157778  420062 cri.go:89] found id: ""
	I1217 20:35:08.157792  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.157800  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:08.157805  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:08.157862  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:08.183372  420062 cri.go:89] found id: ""
	I1217 20:35:08.183386  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.183393  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:08.183399  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:08.183457  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:08.208186  420062 cri.go:89] found id: ""
	I1217 20:35:08.208200  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.208207  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:08.208212  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:08.208310  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:08.236181  420062 cri.go:89] found id: ""
	I1217 20:35:08.236195  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.236202  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:08.236207  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:08.236313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:08.261508  420062 cri.go:89] found id: ""
	I1217 20:35:08.261522  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.261529  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:08.261534  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:08.261593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:08.286303  420062 cri.go:89] found id: ""
	I1217 20:35:08.286318  420062 logs.go:282] 0 containers: []
	W1217 20:35:08.286325  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:08.286333  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:08.286349  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:08.345547  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:08.345573  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:08.360551  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:08.360568  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:08.424581  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:08.415775   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.416570   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.418257   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.419005   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:08.420742   14936 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:08.424593  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:08.424606  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:08.489146  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:08.489166  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.022570  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:11.034138  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:11.034205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:11.066795  420062 cri.go:89] found id: ""
	I1217 20:35:11.066810  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.066817  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:11.066825  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:11.066888  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:11.092902  420062 cri.go:89] found id: ""
	I1217 20:35:11.092917  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.092925  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:11.092931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:11.092998  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:11.120040  420062 cri.go:89] found id: ""
	I1217 20:35:11.120056  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.120064  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:11.120069  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:11.120138  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:11.150096  420062 cri.go:89] found id: ""
	I1217 20:35:11.150111  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.150118  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:11.150124  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:11.150186  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:11.178952  420062 cri.go:89] found id: ""
	I1217 20:35:11.178966  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.178973  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:11.178979  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:11.179042  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:11.205194  420062 cri.go:89] found id: ""
	I1217 20:35:11.205208  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.205215  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:11.205221  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:11.205281  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:11.231314  420062 cri.go:89] found id: ""
	I1217 20:35:11.231327  420062 logs.go:282] 0 containers: []
	W1217 20:35:11.231335  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:11.231343  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:11.231355  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:11.246458  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:11.246475  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:11.312684  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:11.304393   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.305171   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.306693   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.307058   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:11.308710   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:11.312696  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:11.312706  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:11.379354  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:11.379374  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:11.413484  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:11.413500  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
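For anyone post-processing a report like this: every line follows the klog header format, `<severity>MMDD hh:mm:ss.micro  PID file:line] message`, where the leading `I`/`W`/`E` marks Info, Warning, or Error. A small parser for that header (the regular expression is written against the lines above; a sketch, not klog's own grammar):

```go
package main

import (
	"fmt"
	"regexp"
)

// klogLine matches the header used throughout this log: severity letter,
// MMDD, wall-clock time, PID, source file:line, then the message.
var klogLine = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	line := `W1217 20:35:08.132136  420062 logs.go:284] No container was found matching "kube-apiserver"`
	m := klogLine.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
```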
	I1217 20:35:13.972078  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:13.982223  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:13.982290  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:14.022488  420062 cri.go:89] found id: ""
	I1217 20:35:14.022502  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.022510  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:14.022515  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:14.022575  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:14.059328  420062 cri.go:89] found id: ""
	I1217 20:35:14.059342  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.059364  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:14.059369  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:14.059435  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:14.085531  420062 cri.go:89] found id: ""
	I1217 20:35:14.085544  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.085552  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:14.085558  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:14.085616  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:14.114113  420062 cri.go:89] found id: ""
	I1217 20:35:14.114134  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.114141  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:14.114147  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:14.114210  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:14.138505  420062 cri.go:89] found id: ""
	I1217 20:35:14.138519  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.138526  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:14.138532  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:14.138591  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:14.162838  420062 cri.go:89] found id: ""
	I1217 20:35:14.162852  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.162858  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:14.162863  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:14.162923  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:14.190631  420062 cri.go:89] found id: ""
	I1217 20:35:14.190651  420062 logs.go:282] 0 containers: []
	W1217 20:35:14.190665  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:14.190672  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:14.190682  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:14.246544  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:14.246563  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:14.261703  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:14.261719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:14.327698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:14.319587   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.320376   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322035   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.322354   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:14.323849   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:14.327708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:14.327721  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:14.391616  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:14.391635  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:16.921553  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:16.931542  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:16.931604  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:16.955206  420062 cri.go:89] found id: ""
	I1217 20:35:16.955220  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.955227  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:16.955233  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:16.955291  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:16.984598  420062 cri.go:89] found id: ""
	I1217 20:35:16.984613  420062 logs.go:282] 0 containers: []
	W1217 20:35:16.984620  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:16.984625  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:16.984683  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:17.033712  420062 cri.go:89] found id: ""
	I1217 20:35:17.033726  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.033733  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:17.033739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:17.033796  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:17.061936  420062 cri.go:89] found id: ""
	I1217 20:35:17.061950  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.061957  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:17.061963  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:17.062023  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:17.086921  420062 cri.go:89] found id: ""
	I1217 20:35:17.086936  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.086943  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:17.086948  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:17.087009  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:17.112474  420062 cri.go:89] found id: ""
	I1217 20:35:17.112488  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.112495  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:17.112501  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:17.112558  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:17.137847  420062 cri.go:89] found id: ""
	I1217 20:35:17.137867  420062 logs.go:282] 0 containers: []
	W1217 20:35:17.137875  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:17.137882  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:17.137892  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:17.198885  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:17.198904  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:17.213637  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:17.213652  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:17.281467  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:17.272943   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.273684   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275273   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.275893   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:17.277419   15252 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:17.281478  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:17.281488  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:17.343313  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:17.343334  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:19.871984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:19.882066  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:19.882128  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:19.907664  420062 cri.go:89] found id: ""
	I1217 20:35:19.907678  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.907686  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:19.907691  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:19.907750  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:19.936014  420062 cri.go:89] found id: ""
	I1217 20:35:19.936028  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.936035  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:19.936040  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:19.936099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:19.961865  420062 cri.go:89] found id: ""
	I1217 20:35:19.961881  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.961888  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:19.961893  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:19.961954  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:19.988749  420062 cri.go:89] found id: ""
	I1217 20:35:19.988762  420062 logs.go:282] 0 containers: []
	W1217 20:35:19.988769  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:19.988775  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:19.988832  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:20.021844  420062 cri.go:89] found id: ""
	I1217 20:35:20.021859  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.021866  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:20.021873  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:20.021936  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:20.064328  420062 cri.go:89] found id: ""
	I1217 20:35:20.064343  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.064351  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:20.064356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:20.064464  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:20.092230  420062 cri.go:89] found id: ""
	I1217 20:35:20.092244  420062 logs.go:282] 0 containers: []
	W1217 20:35:20.092272  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:20.092280  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:20.092291  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:20.150597  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:20.150617  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:20.166734  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:20.166751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:20.235344  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:20.226511   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.227342   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.228855   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.229349   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:20.230876   15357 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:20.235354  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:20.235368  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:20.300971  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:20.300991  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
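Each pass above is the same diagnostic cycle: minikube polls for a running kube-apiserver process, asks the CRI runtime for each expected control-plane container, and, finding none, falls back to gathering node-level logs. A minimal way to replay that cycle by hand over `minikube ssh` (the profile name functional-682596 is the one this run uses; every command below mirrors one from the log):

    # Poll for a running apiserver process (the pgrep step)
    minikube -p functional-682596 ssh -- sudo pgrep -xnf 'kube-apiserver.*minikube.*'

    # Ask the CRI runtime for each expected control-plane container
    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
                kube-controller-manager kindnet; do
      minikube -p functional-682596 ssh -- sudo crictl ps -a --quiet --name="$name"
    done

    # Node-level fallbacks the cycle gathers when nothing is found
    minikube -p functional-682596 ssh -- sudo journalctl -u kubelet -n 400
    minikube -p functional-682596 ssh -- sudo journalctl -u containerd -n 400

Every crictl call here prints nothing, which is what the repeated `found id: ""` lines in each cycle record.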
	I1217 20:35:22.830503  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:22.840565  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:22.840627  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:22.865965  420062 cri.go:89] found id: ""
	I1217 20:35:22.865980  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.865987  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:22.865992  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:22.866051  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:22.890981  420062 cri.go:89] found id: ""
	I1217 20:35:22.890995  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.891002  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:22.891007  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:22.891067  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:22.916050  420062 cri.go:89] found id: ""
	I1217 20:35:22.916064  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.916070  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:22.916075  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:22.916134  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:22.940231  420062 cri.go:89] found id: ""
	I1217 20:35:22.940244  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.940274  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:22.940280  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:22.940338  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:22.964651  420062 cri.go:89] found id: ""
	I1217 20:35:22.964665  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.964673  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:22.964678  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:22.964739  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:22.999102  420062 cri.go:89] found id: ""
	I1217 20:35:22.999118  420062 logs.go:282] 0 containers: []
	W1217 20:35:22.999126  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:22.999133  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:22.999201  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:23.031417  420062 cri.go:89] found id: ""
	I1217 20:35:23.031431  420062 logs.go:282] 0 containers: []
	W1217 20:35:23.031440  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:23.031447  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:23.031458  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:23.099279  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:23.099300  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:23.127896  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:23.127914  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:23.184706  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:23.184725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:23.199879  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:23.199895  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:23.267184  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:23.258603   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.259294   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.260943   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.261532   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:23.263117   15474 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
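The connection-refused stderr repeated in each cycle is kubectl's discovery client retrying the API group list five times before giving up: the in-node kubeconfig at /var/lib/minikube/kubeconfig points at localhost:8441 (the apiserver port this test starts with), and nothing answers there because no kube-apiserver container exists. A hedged way to confirm that from outside, assuming the `ss` utility is present in the node image:

    # Re-run the failing "describe nodes" call from inside the node
    minikube -p functional-682596 ssh -- sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl \
      describe nodes --kubeconfig=/var/lib/minikube/kubeconfig

    # Check whether anything is listening on the apiserver port (expect no output)
    minikube -p functional-682596 ssh -- sudo ss -ltn 'sport = :8441'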
	I1217 20:35:25.768885  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:25.778947  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:25.779017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:25.802991  420062 cri.go:89] found id: ""
	I1217 20:35:25.803005  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.803025  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:25.803031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:25.803093  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:25.830724  420062 cri.go:89] found id: ""
	I1217 20:35:25.830738  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.830745  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:25.830751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:25.830813  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:25.860059  420062 cri.go:89] found id: ""
	I1217 20:35:25.860073  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.860081  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:25.860085  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:25.860150  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:25.896087  420062 cri.go:89] found id: ""
	I1217 20:35:25.896101  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.896108  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:25.896114  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:25.896173  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:25.921891  420062 cri.go:89] found id: ""
	I1217 20:35:25.921905  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.921912  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:25.921918  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:25.921975  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:25.946115  420062 cri.go:89] found id: ""
	I1217 20:35:25.946129  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.946137  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:25.946142  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:25.946199  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:25.970696  420062 cri.go:89] found id: ""
	I1217 20:35:25.970711  420062 logs.go:282] 0 containers: []
	W1217 20:35:25.970719  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:25.970727  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:25.970737  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:26.031476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:26.031497  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:26.053026  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:26.053044  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:26.121268  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:26.112221   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.113175   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.114729   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.115229   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:26.116856   15565 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:26.121279  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:26.121290  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:26.183866  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:26.183888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
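The "container status" step runs a small shell fallback chain. Expanded, it resolves crictl via `which` (or keeps the bare name so any error message stays legible), lists all containers, and only if that command fails does it try docker:

    # Same fallback as in the log, run inside the node (minikube ssh)
    sudo "$(which crictl || echo crictl)" ps -a || sudo docker ps -a

On a containerd node the crictl branch is expected to succeed, so the docker fallback is never reached.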
	I1217 20:35:28.713125  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:28.723373  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:28.723436  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:28.750204  420062 cri.go:89] found id: ""
	I1217 20:35:28.750218  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.750225  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:28.750231  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:28.750295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:28.774507  420062 cri.go:89] found id: ""
	I1217 20:35:28.774520  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.774528  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:28.774533  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:28.774593  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:28.799202  420062 cri.go:89] found id: ""
	I1217 20:35:28.799217  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.799225  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:28.799230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:28.799295  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:28.823894  420062 cri.go:89] found id: ""
	I1217 20:35:28.823908  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.823916  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:28.823921  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:28.823981  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:28.848696  420062 cri.go:89] found id: ""
	I1217 20:35:28.848710  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.848717  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:28.848722  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:28.848780  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:28.874108  420062 cri.go:89] found id: ""
	I1217 20:35:28.874121  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.874129  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:28.874146  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:28.874206  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:28.899607  420062 cri.go:89] found id: ""
	I1217 20:35:28.899621  420062 logs.go:282] 0 containers: []
	W1217 20:35:28.899628  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:28.899636  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:28.899646  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:28.955990  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:28.956010  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:28.970828  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:28.970844  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:29.048596  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:29.039925   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.040773   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042371   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.042731   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:29.044197   15669 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:29.048606  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:29.048627  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:29.115475  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:29.115495  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:31.644907  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:31.654819  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:31.654879  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:31.678281  420062 cri.go:89] found id: ""
	I1217 20:35:31.678295  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.678303  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:31.678308  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:31.678370  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:31.702902  420062 cri.go:89] found id: ""
	I1217 20:35:31.702916  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.702923  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:31.702929  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:31.702988  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:31.730614  420062 cri.go:89] found id: ""
	I1217 20:35:31.730629  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.730643  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:31.730648  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:31.730715  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:31.757724  420062 cri.go:89] found id: ""
	I1217 20:35:31.757738  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.757745  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:31.757751  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:31.757821  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:31.781313  420062 cri.go:89] found id: ""
	I1217 20:35:31.781326  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.781333  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:31.781338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:31.781401  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:31.805048  420062 cri.go:89] found id: ""
	I1217 20:35:31.805061  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.805068  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:31.805074  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:31.805133  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:31.829157  420062 cri.go:89] found id: ""
	I1217 20:35:31.829172  420062 logs.go:282] 0 containers: []
	W1217 20:35:31.829178  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:31.829186  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:31.829211  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:31.884232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:31.884262  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:31.899125  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:31.899143  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:31.960768  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:31.952914   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.953466   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.954986   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.955445   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:31.957040   15777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:31.960779  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:31.960789  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:32.026560  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:32.026580  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
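The five log sources each cycle gathers (kubelet, dmesg, describe nodes, containerd, container status) are essentially the same ones behind `minikube logs`, so the whole bundle can be pulled in one shot when debugging a run like this (the flags below are standard minikube flags):

    # Collect the same evidence the cycles above gather piecemeal
    minikube -p functional-682596 logs --file=functional-682596.log

Note the sources are not gathered in a fixed order; later cycles in this log start with containerd or dmesg instead of kubelet.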
	I1217 20:35:34.561956  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:34.573345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:34.573414  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:34.601971  420062 cri.go:89] found id: ""
	I1217 20:35:34.601985  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.601993  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:34.601998  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:34.602057  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:34.631487  420062 cri.go:89] found id: ""
	I1217 20:35:34.631500  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.631508  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:34.631513  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:34.631572  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:34.656452  420062 cri.go:89] found id: ""
	I1217 20:35:34.656465  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.656473  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:34.656478  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:34.656540  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:34.682582  420062 cri.go:89] found id: ""
	I1217 20:35:34.682596  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.682603  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:34.682609  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:34.682676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:34.713925  420062 cri.go:89] found id: ""
	I1217 20:35:34.713939  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.713947  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:34.713952  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:34.714017  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:34.742385  420062 cri.go:89] found id: ""
	I1217 20:35:34.742400  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.742408  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:34.742414  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:34.742473  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:34.767035  420062 cri.go:89] found id: ""
	I1217 20:35:34.767049  420062 logs.go:282] 0 containers: []
	W1217 20:35:34.767056  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:34.767064  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:34.767075  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:34.822796  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:34.822817  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:34.837590  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:34.837613  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:34.900508  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:34.892940   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.893576   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895113   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.895412   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:34.896864   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:34.900518  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:34.900529  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:34.962881  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:34.962905  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:37.494984  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:37.505451  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:37.505514  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:37.530852  420062 cri.go:89] found id: ""
	I1217 20:35:37.530866  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.530874  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:37.530885  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:37.530948  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:37.555283  420062 cri.go:89] found id: ""
	I1217 20:35:37.555298  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.555305  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:37.555319  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:37.555384  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:37.580310  420062 cri.go:89] found id: ""
	I1217 20:35:37.580324  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.580342  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:37.580347  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:37.580407  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:37.604561  420062 cri.go:89] found id: ""
	I1217 20:35:37.604575  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.604582  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:37.604587  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:37.604649  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:37.633577  420062 cri.go:89] found id: ""
	I1217 20:35:37.633591  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.633598  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:37.633603  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:37.633668  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:37.659137  420062 cri.go:89] found id: ""
	I1217 20:35:37.659152  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.659159  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:37.659183  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:37.659280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:37.687689  420062 cri.go:89] found id: ""
	I1217 20:35:37.687704  420062 logs.go:282] 0 containers: []
	W1217 20:35:37.687711  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:37.687719  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:37.687738  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:37.742459  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:37.742478  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:37.757175  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:37.757191  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:37.822005  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:37.813077   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.813679   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.815702   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.816474   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:37.817981   15986 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:37.822015  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:37.822025  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:37.885848  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:37.885870  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:40.416602  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:40.427031  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:40.427099  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:40.452190  420062 cri.go:89] found id: ""
	I1217 20:35:40.452204  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.452212  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:40.452218  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:40.452299  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:40.478942  420062 cri.go:89] found id: ""
	I1217 20:35:40.478956  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.478963  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:40.478969  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:40.479027  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:40.504873  420062 cri.go:89] found id: ""
	I1217 20:35:40.504886  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.504893  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:40.504898  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:40.504958  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:40.530153  420062 cri.go:89] found id: ""
	I1217 20:35:40.530167  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.530173  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:40.530179  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:40.530239  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:40.558703  420062 cri.go:89] found id: ""
	I1217 20:35:40.558717  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.558725  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:40.558731  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:40.558799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:40.583753  420062 cri.go:89] found id: ""
	I1217 20:35:40.583768  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.583777  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:40.583793  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:40.583856  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:40.608061  420062 cri.go:89] found id: ""
	I1217 20:35:40.608075  420062 logs.go:282] 0 containers: []
	W1217 20:35:40.608083  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:40.608099  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:40.608111  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:40.665201  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:40.665222  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:40.680290  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:40.680307  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:40.752424  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:40.739302   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.740073   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.745453   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.746616   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:40.748372   16089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:40.752435  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:40.752446  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:40.819510  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:40.819535  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
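Judging from the timestamps (20:35:19, :22, :25, ... :46), the passes run roughly three seconds apart, that is, a short sleep between polls with the real deadline enforced higher up in minikube. A rough stand-in for the same wait loop, with the interval inferred from those timestamps rather than taken from the source:

    # Retry until an apiserver process appears (3 s interval is an inference)
    until minikube -p functional-682596 ssh -- sudo pgrep -xnf 'kube-apiserver.*minikube.*'; do
      sleep 3
    done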
	I1217 20:35:43.356404  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:43.367228  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:43.367293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:43.391809  420062 cri.go:89] found id: ""
	I1217 20:35:43.391824  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.391831  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:43.391836  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:43.391895  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:43.417869  420062 cri.go:89] found id: ""
	I1217 20:35:43.417883  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.417890  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:43.417895  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:43.417959  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:43.443314  420062 cri.go:89] found id: ""
	I1217 20:35:43.443328  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.443335  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:43.443340  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:43.443400  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:43.469332  420062 cri.go:89] found id: ""
	I1217 20:35:43.469346  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.469352  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:43.469358  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:43.469418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:43.494242  420062 cri.go:89] found id: ""
	I1217 20:35:43.494256  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.494264  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:43.494277  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:43.494341  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:43.520502  420062 cri.go:89] found id: ""
	I1217 20:35:43.520515  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.520523  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:43.520529  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:43.520592  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:43.549390  420062 cri.go:89] found id: ""
	I1217 20:35:43.549404  420062 logs.go:282] 0 containers: []
	W1217 20:35:43.549411  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:43.549419  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:43.549435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:43.565708  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:43.565725  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:43.633544  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:43.624678   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.625383   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627234   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.627820   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:43.629497   16190 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:43.633555  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:43.633567  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:43.696433  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:43.696457  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:43.727227  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:43.727244  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
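
Every one of these "describe nodes" failures is the same symptom: kubectl inside the node dials https://localhost:8441 and nothing is listening, because no kube-apiserver container ever started (each crictl query above returned an empty id list). The probe can be replayed by hand from inside the node; a minimal sketch using the exact binary and kubeconfig paths shown in the log:

	# Replay the failing probe from inside the minikube node.
	# Paths are the ones logged above; while the apiserver is down
	# this exits 1 with "connection to the server localhost:8441 was refused".
	sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes \
	  --kubeconfig=/var/lib/minikube/kubeconfig
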
	I1217 20:35:46.288373  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:46.298318  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:46.298381  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:46.322903  420062 cri.go:89] found id: ""
	I1217 20:35:46.322918  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.322925  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:46.322931  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:46.322992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:46.347241  420062 cri.go:89] found id: ""
	I1217 20:35:46.347253  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.347260  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:46.347265  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:46.347324  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:46.372209  420062 cri.go:89] found id: ""
	I1217 20:35:46.372222  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.372229  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:46.372235  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:46.372313  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:46.399343  420062 cri.go:89] found id: ""
	I1217 20:35:46.399357  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.399365  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:46.399370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:46.399430  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:46.425023  420062 cri.go:89] found id: ""
	I1217 20:35:46.425036  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.425051  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:46.425057  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:46.425119  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:46.450066  420062 cri.go:89] found id: ""
	I1217 20:35:46.450080  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.450087  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:46.450092  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:46.450153  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:46.474598  420062 cri.go:89] found id: ""
	I1217 20:35:46.474612  420062 logs.go:282] 0 containers: []
	W1217 20:35:46.474619  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:46.474644  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:46.474654  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:46.536781  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:46.536801  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:46.570140  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:46.570155  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:46.628870  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:46.628888  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:46.643875  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:46.643891  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:46.709883  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:46.701485   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.702111   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.703801   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.704133   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:46.705726   16309 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
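
The cycle then repeats every few seconds: pgrep for a running apiserver, enumerate each control-plane component through crictl, and re-gather the kubelet, dmesg, containerd, and container-status logs. A rough standalone equivalent of that wait loop (an assumed sketch of the observed behaviour, not minikube's actual implementation; the 300-second deadline is a placeholder):

	#!/usr/bin/env bash
	# Poll until a kube-apiserver process appears or the deadline passes.
	deadline=$((SECONDS + 300))   # placeholder timeout, not from the log
	while (( SECONDS < deadline )); do
	  if sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; then
	    echo "apiserver is up"; exit 0
	  fi
	  # Fall back to the CRI view, mirroring the cri.go lines above.
	  sudo crictl ps -a --quiet --name=kube-apiserver
	  sleep 3
	done
	echo "apiserver never came up" >&2; exit 1
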
	I1217 20:35:49.210139  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:49.220394  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:49.220461  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:49.256343  420062 cri.go:89] found id: ""
	I1217 20:35:49.256358  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.256365  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:49.256370  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:49.256431  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:49.290171  420062 cri.go:89] found id: ""
	I1217 20:35:49.290185  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.290193  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:49.290198  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:49.290261  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:49.320916  420062 cri.go:89] found id: ""
	I1217 20:35:49.320931  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.320939  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:49.320944  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:49.321003  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:49.345394  420062 cri.go:89] found id: ""
	I1217 20:35:49.345408  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.345415  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:49.345421  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:49.345478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:49.370339  420062 cri.go:89] found id: ""
	I1217 20:35:49.370353  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.370360  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:49.370365  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:49.370424  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:49.394642  420062 cri.go:89] found id: ""
	I1217 20:35:49.394656  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.394663  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:49.394668  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:49.394734  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:49.422548  420062 cri.go:89] found id: ""
	I1217 20:35:49.422562  420062 logs.go:282] 0 containers: []
	W1217 20:35:49.422569  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:49.422577  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:49.422594  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:49.479225  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:49.479246  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:49.494238  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:49.494255  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:49.560086  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:49.552332   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.552825   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554311   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.554738   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:49.556232   16400 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:49.560096  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:49.560106  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:49.622094  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:49.622114  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.150210  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:52.160168  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:52.160231  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:52.184746  420062 cri.go:89] found id: ""
	I1217 20:35:52.184760  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.184767  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:52.184779  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:52.184835  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:52.209501  420062 cri.go:89] found id: ""
	I1217 20:35:52.209515  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.209522  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:52.209528  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:52.209586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:52.234558  420062 cri.go:89] found id: ""
	I1217 20:35:52.234571  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.234579  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:52.234584  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:52.234654  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:52.265703  420062 cri.go:89] found id: ""
	I1217 20:35:52.265716  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.265724  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:52.265729  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:52.265794  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:52.297248  420062 cri.go:89] found id: ""
	I1217 20:35:52.297263  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.297270  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:52.297275  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:52.297334  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:52.325342  420062 cri.go:89] found id: ""
	I1217 20:35:52.325355  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.325362  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:52.325367  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:52.325433  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:52.349812  420062 cri.go:89] found id: ""
	I1217 20:35:52.349826  420062 logs.go:282] 0 containers: []
	W1217 20:35:52.349843  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:52.349851  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:52.349862  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:52.380735  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:52.380751  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:52.436131  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:52.436151  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:52.451427  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:52.451445  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:52.518482  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:52.509497   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.510168   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.512343   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.513295   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:52.514564   16512 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:52.518492  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:52.518503  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:55.081073  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:55.091720  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:55.091797  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:55.117311  420062 cri.go:89] found id: ""
	I1217 20:35:55.117325  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.117333  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:55.117338  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:55.117398  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:55.141668  420062 cri.go:89] found id: ""
	I1217 20:35:55.141683  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.141692  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:55.141697  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:55.141760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:55.166517  420062 cri.go:89] found id: ""
	I1217 20:35:55.166534  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.166541  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:55.166546  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:55.166611  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:55.191282  420062 cri.go:89] found id: ""
	I1217 20:35:55.191296  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.191304  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:55.191309  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:55.191369  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:55.215605  420062 cri.go:89] found id: ""
	I1217 20:35:55.215619  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.215626  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:55.215631  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:55.215690  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:55.247101  420062 cri.go:89] found id: ""
	I1217 20:35:55.247124  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.247132  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:55.247137  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:55.247205  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:55.288704  420062 cri.go:89] found id: ""
	I1217 20:35:55.288718  420062 logs.go:282] 0 containers: []
	W1217 20:35:55.288725  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:55.288732  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:55.288743  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:55.320382  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:55.320398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:55.379997  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:55.380016  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:55.394762  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:55.394780  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:55.459997  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:55.451538   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.452219   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.453851   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.454661   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:55.456164   16616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:55.460007  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:55.460018  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:35:58.024408  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:35:58.035410  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:35:58.035478  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:35:58.062124  420062 cri.go:89] found id: ""
	I1217 20:35:58.062138  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.062145  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:35:58.062151  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:35:58.062211  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:35:58.088229  420062 cri.go:89] found id: ""
	I1217 20:35:58.088243  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.088270  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:35:58.088276  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:35:58.088335  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:35:58.113240  420062 cri.go:89] found id: ""
	I1217 20:35:58.113255  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.113261  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:35:58.113266  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:35:58.113325  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:35:58.141811  420062 cri.go:89] found id: ""
	I1217 20:35:58.141825  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.141832  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:35:58.141837  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:35:58.141897  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:35:58.170463  420062 cri.go:89] found id: ""
	I1217 20:35:58.170477  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.170484  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:35:58.170490  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:35:58.170548  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:35:58.194647  420062 cri.go:89] found id: ""
	I1217 20:35:58.194670  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.194678  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:35:58.194684  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:35:58.194760  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:35:58.219714  420062 cri.go:89] found id: ""
	I1217 20:35:58.219728  420062 logs.go:282] 0 containers: []
	W1217 20:35:58.219735  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:35:58.219743  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:35:58.219754  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:35:58.263178  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:35:58.263194  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:35:58.325412  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:35:58.325433  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:35:58.341419  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:35:58.341435  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:35:58.403135  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:35:58.394644   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.395273   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.396931   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.397587   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:35:58.399184   16722 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:35:58.403147  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:35:58.403163  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:00.965498  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:00.975759  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:00.975820  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:01.000786  420062 cri.go:89] found id: ""
	I1217 20:36:01.000803  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.000811  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:01.000818  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:01.000892  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:01.025695  420062 cri.go:89] found id: ""
	I1217 20:36:01.025709  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.025716  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:01.025721  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:01.025784  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:01.054712  420062 cri.go:89] found id: ""
	I1217 20:36:01.054727  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.054734  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:01.054739  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:01.054799  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:01.083318  420062 cri.go:89] found id: ""
	I1217 20:36:01.083332  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.083340  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:01.083345  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:01.083406  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:01.107939  420062 cri.go:89] found id: ""
	I1217 20:36:01.107954  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.107962  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:01.107968  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:01.108030  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:01.134926  420062 cri.go:89] found id: ""
	I1217 20:36:01.134940  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.134947  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:01.134954  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:01.135018  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:01.161095  420062 cri.go:89] found id: ""
	I1217 20:36:01.161111  420062 logs.go:282] 0 containers: []
	W1217 20:36:01.161121  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:01.161130  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:01.161141  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:01.222094  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:01.222112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:01.239432  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:01.239449  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:01.331243  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:01.322562   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.323102   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.324888   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.325430   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:01.327051   16814 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:01.331254  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:01.331265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:01.398128  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:01.398148  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:03.929660  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:03.940045  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:03.940111  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:03.963644  420062 cri.go:89] found id: ""
	I1217 20:36:03.963658  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.963665  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:03.963670  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:03.963727  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:03.996893  420062 cri.go:89] found id: ""
	I1217 20:36:03.996907  420062 logs.go:282] 0 containers: []
	W1217 20:36:03.996914  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:03.996919  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:03.996987  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:04.028499  420062 cri.go:89] found id: ""
	I1217 20:36:04.028514  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.028530  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:04.028535  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:04.028607  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:04.054700  420062 cri.go:89] found id: ""
	I1217 20:36:04.054715  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.054723  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:04.054728  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:04.054785  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:04.082040  420062 cri.go:89] found id: ""
	I1217 20:36:04.082054  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.082063  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:04.082068  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:04.082131  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:04.107015  420062 cri.go:89] found id: ""
	I1217 20:36:04.107029  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.107037  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:04.107043  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:04.107109  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:04.134634  420062 cri.go:89] found id: ""
	I1217 20:36:04.134648  420062 logs.go:282] 0 containers: []
	W1217 20:36:04.134655  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:04.134663  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:04.134673  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:04.191059  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:04.191079  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:04.206280  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:04.206298  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:04.297698  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:04.288551   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.289379   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.290529   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.291295   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:04.292969   16915 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:04.297708  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:04.297719  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:04.364378  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:04.364398  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:06.892149  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:06.902353  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:06.902418  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:06.927834  420062 cri.go:89] found id: ""
	I1217 20:36:06.927847  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.927855  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:06.927860  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:06.927925  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:06.952936  420062 cri.go:89] found id: ""
	I1217 20:36:06.952949  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.952956  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:06.952965  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:06.953024  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:06.976184  420062 cri.go:89] found id: ""
	I1217 20:36:06.976198  420062 logs.go:282] 0 containers: []
	W1217 20:36:06.976205  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:06.976210  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:06.976297  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:07.004079  420062 cri.go:89] found id: ""
	I1217 20:36:07.004093  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.004101  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:07.004106  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:07.004167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:07.029604  420062 cri.go:89] found id: ""
	I1217 20:36:07.029618  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.029625  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:07.029630  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:07.029698  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:07.058618  420062 cri.go:89] found id: ""
	I1217 20:36:07.058637  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.058645  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:07.058650  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:07.058709  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:07.085932  420062 cri.go:89] found id: ""
	I1217 20:36:07.085946  420062 logs.go:282] 0 containers: []
	W1217 20:36:07.085953  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:07.085961  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:07.085972  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:07.100543  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:07.100561  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:07.162557  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:07.154011   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.154703   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.156341   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.157015   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:07.158662   17022 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:07.162567  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:07.162578  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:07.226244  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:07.226265  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:07.280558  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:07.280574  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
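The cycle above is minikube probing for each expected control-plane container: it lists matching containers through crictl and treats an empty ID list as "component not running". A minimal bash sketch of that probe, using only commands that appear in the log (the component names are the ones the probe iterates over):

    # One probe per component; an empty result reproduces the
    # "0 containers: []" / "No container was found matching ..." lines above.
    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
                kube-controller-manager kindnet; do
      ids=$(sudo crictl ps -a --quiet --name="$name")
      [ -z "$ids" ] && echo "No container was found matching \"$name\""
    done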
	I1217 20:36:09.844282  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:09.854593  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:09.854676  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:09.883180  420062 cri.go:89] found id: ""
	I1217 20:36:09.883194  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.883202  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:09.883208  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:09.883268  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:09.907225  420062 cri.go:89] found id: ""
	I1217 20:36:09.907240  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.907248  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:09.907254  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:09.907315  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:09.936079  420062 cri.go:89] found id: ""
	I1217 20:36:09.936093  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.936100  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:09.936105  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:09.936167  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:09.961921  420062 cri.go:89] found id: ""
	I1217 20:36:09.961935  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.961943  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:09.961949  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:09.962028  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:09.989285  420062 cri.go:89] found id: ""
	I1217 20:36:09.989299  420062 logs.go:282] 0 containers: []
	W1217 20:36:09.989307  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:09.989312  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:09.989371  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:10.023888  420062 cri.go:89] found id: ""
	I1217 20:36:10.023905  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.023913  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:10.023920  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:10.023992  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:10.056062  420062 cri.go:89] found id: ""
	I1217 20:36:10.056077  420062 logs.go:282] 0 containers: []
	W1217 20:36:10.056084  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:10.056102  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:10.056112  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:10.118144  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:10.118165  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:10.153504  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:10.153521  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:10.209909  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:10.209931  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:10.224930  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:10.224946  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:10.310457  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:10.302878   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.303301   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304492   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.304881   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:10.306456   17144 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:12.811296  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:12.821279  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:36:12.821339  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:36:12.845496  420062 cri.go:89] found id: ""
	I1217 20:36:12.845510  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.845519  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:36:12.845524  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:36:12.845582  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:36:12.873951  420062 cri.go:89] found id: ""
	I1217 20:36:12.873966  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.873973  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:36:12.873978  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:36:12.874039  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:36:12.898560  420062 cri.go:89] found id: ""
	I1217 20:36:12.898573  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.898580  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:36:12.898586  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:36:12.898661  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:36:12.931323  420062 cri.go:89] found id: ""
	I1217 20:36:12.931343  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.931350  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:36:12.931356  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:36:12.931416  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:36:12.957667  420062 cri.go:89] found id: ""
	I1217 20:36:12.957680  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.957687  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:36:12.957692  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:36:12.957749  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:36:12.981848  420062 cri.go:89] found id: ""
	I1217 20:36:12.981863  420062 logs.go:282] 0 containers: []
	W1217 20:36:12.981870  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:36:12.981876  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:36:12.981934  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:36:13.007649  420062 cri.go:89] found id: ""
	I1217 20:36:13.007664  420062 logs.go:282] 0 containers: []
	W1217 20:36:13.007671  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:36:13.007679  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:36:13.007689  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:36:13.070827  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:36:13.070846  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 20:36:13.098938  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:36:13.098954  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:36:13.155232  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:36:13.155253  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:36:13.170218  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:36:13.170234  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:36:13.237601  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:36:13.228684   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.229296   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.230990   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.231505   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:36:13.233204   17248 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:36:15.739451  420062 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:36:15.749635  420062 kubeadm.go:602] duration metric: took 4m4.768391835s to restartPrimaryControlPlane
	W1217 20:36:15.749706  420062 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 20:36:15.749781  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:36:16.165425  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:36:16.179463  420062 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 20:36:16.187987  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:36:16.188041  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:36:16.195805  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:36:16.195815  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:36:16.195868  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:36:16.203578  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:36:16.203633  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:36:16.211222  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:36:16.218882  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:36:16.218939  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:36:16.226500  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.233980  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:36:16.234040  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:36:16.241486  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:36:16.250121  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:36:16.250177  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
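Each grep/rm pair above applies the same check to the four kubeconfig files: if a file does not contain the expected control-plane endpoint (here none of them even exist, so grep exits with status 2), it is removed before kubeadm init is re-run. A hedged sketch of that cleanup loop, built only from the commands shown in the log:

    # Drop any kubeconfig that does not point at the expected endpoint.
    endpoint="https://control-plane.minikube.internal:8441"
    for f in admin kubelet controller-manager scheduler; do
      sudo grep "$endpoint" "/etc/kubernetes/$f.conf" >/dev/null 2>&1 \
        || sudo rm -f "/etc/kubernetes/$f.conf"
    done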
	I1217 20:36:16.257963  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:36:16.296719  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:36:16.297028  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:36:16.367021  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:36:16.367085  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:36:16.367119  420062 kubeadm.go:319] OS: Linux
	I1217 20:36:16.367163  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:36:16.367211  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:36:16.367257  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:36:16.367304  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:36:16.367351  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:36:16.367397  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:36:16.367441  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:36:16.367493  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:36:16.367539  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:36:16.443855  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:36:16.443958  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:36:16.444047  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:36:16.456800  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:36:16.459720  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:36:16.459808  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:36:16.459875  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:36:16.459957  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:36:16.460026  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:36:16.460100  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:36:16.460156  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:36:16.460222  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:36:16.460299  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:36:16.460377  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:36:16.460454  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:36:16.460493  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:36:16.460552  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:36:16.591707  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:36:16.773515  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:36:16.895942  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:36:17.316963  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:36:17.418134  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:36:17.418872  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:36:17.421748  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:36:17.424898  420062 out.go:252]   - Booting up control plane ...
	I1217 20:36:17.424999  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:36:17.425075  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:36:17.425522  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:36:17.446706  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:36:17.446809  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:36:17.455830  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:36:17.455925  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:36:17.455963  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:36:17.596746  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:36:17.596869  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:40:17.595000  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000220112s
	I1217 20:40:17.595032  420062 kubeadm.go:319] 
	I1217 20:40:17.595086  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:40:17.595116  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:40:17.595215  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:40:17.595220  420062 kubeadm.go:319] 
	I1217 20:40:17.595317  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:40:17.595346  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:40:17.595375  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:40:17.595378  420062 kubeadm.go:319] 
	I1217 20:40:17.599582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:40:17.600077  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:40:17.600181  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:40:17.600461  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:40:17.600468  420062 kubeadm.go:319] 
	I1217 20:40:17.600540  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
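The wait-control-plane failure above is kubeadm polling the kubelet's local health endpoint until the 4m0s deadline expires. The same probe can be run by hand with the exact call the error message names:

    # The health check kubeadm describes; on this node it never succeeds
    # because the kubelet does not come up.
    curl -sSL http://127.0.0.1:10248/healthz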
	W1217 20:40:17.600694  420062 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000220112s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
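
Before the retry below, the warnings above point at two concrete things to check on the node; all three commands are taken verbatim from the output:

    sudo systemctl enable kubelet.service   # per [WARNING Service-kubelet]
    systemctl status kubelet                # is the unit running at all?
    journalctl -xeu kubelet                 # the kubelet's own error trail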
	
	I1217 20:40:17.600780  420062 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 20:40:18.014309  420062 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:40:18.029681  420062 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 20:40:18.029742  420062 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 20:40:18.038728  420062 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 20:40:18.038739  420062 kubeadm.go:158] found existing configuration files:
	
	I1217 20:40:18.038796  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1217 20:40:18.047726  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 20:40:18.047785  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 20:40:18.056139  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1217 20:40:18.064964  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 20:40:18.065020  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 20:40:18.073071  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.081347  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 20:40:18.081407  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 20:40:18.089386  420062 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1217 20:40:18.097546  420062 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 20:40:18.097608  420062 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 20:40:18.105445  420062 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 20:40:18.146508  420062 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 20:40:18.146883  420062 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 20:40:18.223079  420062 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 20:40:18.223139  420062 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 20:40:18.223171  420062 kubeadm.go:319] OS: Linux
	I1217 20:40:18.223212  420062 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 20:40:18.223257  420062 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 20:40:18.223306  420062 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 20:40:18.223354  420062 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 20:40:18.223398  420062 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 20:40:18.223442  420062 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 20:40:18.223484  420062 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 20:40:18.223529  420062 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 20:40:18.223571  420062 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 20:40:18.290116  420062 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 20:40:18.290214  420062 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 20:40:18.290297  420062 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 20:40:18.296827  420062 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 20:40:18.300313  420062 out.go:252]   - Generating certificates and keys ...
	I1217 20:40:18.300404  420062 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 20:40:18.300483  420062 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 20:40:18.300564  420062 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 20:40:18.300623  420062 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 20:40:18.300692  420062 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 20:40:18.300745  420062 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 20:40:18.300806  420062 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 20:40:18.300867  420062 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 20:40:18.300940  420062 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 20:40:18.301011  420062 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 20:40:18.301047  420062 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 20:40:18.301101  420062 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 20:40:18.651136  420062 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 20:40:18.865861  420062 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 20:40:19.156184  420062 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 20:40:19.613234  420062 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 20:40:19.777874  420062 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 20:40:19.778689  420062 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 20:40:19.781521  420062 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 20:40:19.784636  420062 out.go:252]   - Booting up control plane ...
	I1217 20:40:19.784726  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 20:40:19.784798  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 20:40:19.786110  420062 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 20:40:19.806173  420062 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 20:40:19.806463  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 20:40:19.814039  420062 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 20:40:19.814294  420062 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 20:40:19.814465  420062 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 20:40:19.960654  420062 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 20:40:19.960777  420062 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 20:44:19.954818  420062 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001239508s
	I1217 20:44:19.954843  420062 kubeadm.go:319] 
	I1217 20:44:19.954896  420062 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 20:44:19.954927  420062 kubeadm.go:319] 	- The kubelet is not running
	I1217 20:44:19.955102  420062 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 20:44:19.955108  420062 kubeadm.go:319] 
	I1217 20:44:19.955205  420062 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 20:44:19.955233  420062 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 20:44:19.955262  420062 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 20:44:19.955265  420062 kubeadm.go:319] 
	I1217 20:44:19.960153  420062 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 20:44:19.960582  420062 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 20:44:19.960689  420062 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 20:44:19.960924  420062 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 20:44:19.960929  420062 kubeadm.go:319] 
	I1217 20:44:19.960996  420062 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 20:44:19.961048  420062 kubeadm.go:403] duration metric: took 12m9.01968184s to StartCluster
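The recurring [WARNING SystemVerification] names the one kubelet option tied to this node's cgroups v1 setup: 'FailCgroupV1'. As an illustrative sketch only (the field name comes from the warning text; the file path is the one the kubeadm output above writes, and it is regenerated on every init), the option could be added to the kubelet configuration like so:

    # Illustrative only: append the option named in the warning to the
    # kubelet config that kubeadm wrote above.
    echo 'failCgroupV1: false' | sudo tee -a /var/lib/kubelet/config.yaml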
	I1217 20:44:19.961079  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 20:44:19.961139  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 20:44:19.999166  420062 cri.go:89] found id: ""
	I1217 20:44:19.999182  420062 logs.go:282] 0 containers: []
	W1217 20:44:19.999190  420062 logs.go:284] No container was found matching "kube-apiserver"
	I1217 20:44:19.999195  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 20:44:19.999265  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 20:44:20.031203  420062 cri.go:89] found id: ""
	I1217 20:44:20.031218  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.031225  420062 logs.go:284] No container was found matching "etcd"
	I1217 20:44:20.031230  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 20:44:20.031293  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 20:44:20.061179  420062 cri.go:89] found id: ""
	I1217 20:44:20.061193  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.061200  420062 logs.go:284] No container was found matching "coredns"
	I1217 20:44:20.061219  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 20:44:20.061280  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 20:44:20.089093  420062 cri.go:89] found id: ""
	I1217 20:44:20.089107  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.089114  420062 logs.go:284] No container was found matching "kube-scheduler"
	I1217 20:44:20.089120  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 20:44:20.089183  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 20:44:20.119683  420062 cri.go:89] found id: ""
	I1217 20:44:20.119696  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.119704  420062 logs.go:284] No container was found matching "kube-proxy"
	I1217 20:44:20.119709  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 20:44:20.119772  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 20:44:20.145500  420062 cri.go:89] found id: ""
	I1217 20:44:20.145514  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.145521  420062 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 20:44:20.145526  420062 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 20:44:20.145586  420062 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 20:44:20.170345  420062 cri.go:89] found id: ""
	I1217 20:44:20.170359  420062 logs.go:282] 0 containers: []
	W1217 20:44:20.170367  420062 logs.go:284] No container was found matching "kindnet"
	I1217 20:44:20.170377  420062 logs.go:123] Gathering logs for kubelet ...
	I1217 20:44:20.170387  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 20:44:20.226476  420062 logs.go:123] Gathering logs for dmesg ...
	I1217 20:44:20.226496  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 20:44:20.241970  420062 logs.go:123] Gathering logs for describe nodes ...
	I1217 20:44:20.241987  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 20:44:20.311525  420062 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1217 20:44:20.302109   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.302950   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.304712   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.305374   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:20.307049   21060 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 20:44:20.311535  420062 logs.go:123] Gathering logs for containerd ...
	I1217 20:44:20.311546  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 20:44:20.375759  420062 logs.go:123] Gathering logs for container status ...
	I1217 20:44:20.375781  420062 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 20:44:20.404823  420062 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 20:44:20.404857  420062 out.go:285] * 
	W1217 20:44:20.404931  420062 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.404948  420062 out.go:285] * 
	W1217 20:44:20.407052  420062 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 20:44:20.412138  420062 out.go:203] 
	W1217 20:44:20.415946  420062 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001239508s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 20:44:20.415994  420062 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 20:44:20.416018  420062 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 20:44:20.419093  420062 out.go:203] 
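The warning stream and the kubelet journal further down agree on the root cause: on this cgroup v1 host, kubelet v1.35.0-rc.1 refuses to start unless cgroup v1 support is explicitly re-enabled. A minimal sketch of the two remediations the log itself names, assuming shell access to the node (the YAML spelling of the 'FailCgroupV1' option is failCgroupV1; minikube may regenerate /var/lib/kubelet/config.yaml on restart, and if the key already exists it should be edited in place rather than appended):

	# Option 1, from the SystemVerification warning: re-enable cgroup v1
	# support in the kubelet configuration and restart the unit.
	out/minikube-linux-arm64 -p functional-682596 ssh "echo 'failCgroupV1: false' | sudo tee -a /var/lib/kubelet/config.yaml && sudo systemctl restart kubelet"

	# Option 2, from the Suggestion line above: restart the cluster with the
	# kubelet override minikube itself proposes.
	out/minikube-linux-arm64 start -p functional-682596 --extra-config=kubelet.cgroup-driver=systemd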
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304775544Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304836714Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304892469Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.304951784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305023309Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305106805Z" level=info msg="Connect containerd service"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.305473562Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.306163145Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318314045Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318400322Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318427285Z" level=info msg="Start subscribing containerd event"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.318481078Z" level=info msg="Start recovering state"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358031279Z" level=info msg="Start event monitor"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358217808Z" level=info msg="Start cni network conf syncer for default"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358291688Z" level=info msg="Start streaming server"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358359291Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358415021Z" level=info msg="runtime interface starting up..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358467600Z" level=info msg="starting plugins..."
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.358529204Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 17 20:32:09 functional-682596 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 17 20:32:09 functional-682596 containerd[9792]: time="2025-12-17T20:32:09.361000854Z" level=info msg="containerd successfully booted in 0.082346s"
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.467315405Z" level=info msg="No images store for sha256:426b8c85f3639ce7684f335da56e517a857cd0c0b418e28f3fce1e3079a57b26"
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.472197664Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-682596\""
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.485251105Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 20:44:29 functional-682596 containerd[9792]: time="2025-12-17T20:44:29.485700024Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-682596\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1217 20:44:30.391344   21824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:30.392083   21824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:30.393725   21824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:30.394384   21824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1217 20:44:30.396064   21824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
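Every error in this block is the same symptom: nothing listens on apiserver port 8441 because the kubelet never brought up the static pods. A quick check from inside the node before reaching for kubectl (a sketch; curl is present in the node image, since kubeadm's own health check above uses it):

	# Expect 'connection refused' while the control plane is down; any HTTP
	# response at all, even a 401, would mean the apiserver is actually up.
	out/minikube-linux-arm64 -p functional-682596 ssh "curl -sk https://localhost:8441/healthz"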
	
	
	==> dmesg <==
	[Dec17 17:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015536] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.514164] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034184] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.806183] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.649674] kauditd_printk_skb: 36 callbacks suppressed
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 20:44:30 up  3:26,  0 user,  load average: 1.10, 0.37, 0.50
	Linux functional-682596 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 20:44:26 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:27 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 330.
	Dec 17 20:44:27 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:27 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:27 functional-682596 kubelet[21578]: E1217 20:44:27.561025   21578 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:27 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:27 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:28 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 331.
	Dec 17 20:44:28 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:28 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:28 functional-682596 kubelet[21633]: E1217 20:44:28.301773   21633 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:28 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:28 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:28 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 332.
	Dec 17 20:44:28 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:28 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:29 functional-682596 kubelet[21669]: E1217 20:44:29.095366   21669 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:29 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:29 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 20:44:29 functional-682596 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 333.
	Dec 17 20:44:29 functional-682596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:29 functional-682596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 20:44:29 functional-682596 kubelet[21733]: E1217 20:44:29.857811   21733 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 20:44:29 functional-682596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 20:44:29 functional-682596 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
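The kubelet journal at the end of that dump shows systemd cycling the unit (restart counter 330 through 333) with the identical validation error every time, so the loop cannot resolve on its own. The log's own troubleshooting advice, condensed into a sketch that pulls out just that message (the grep pattern is an assumption matching this run's error text):

	out/minikube-linux-arm64 -p functional-682596 ssh "systemctl status kubelet --no-pager"
	out/minikube-linux-arm64 -p functional-682596 ssh "sudo journalctl -xeu kubelet --no-pager | grep 'failed to validate kubelet configuration'"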
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-682596 -n functional-682596: exit status 2 (463.920283ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-682596" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (3.13s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-682596 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-682596 create deployment hello-node --image kicbase/echo-server: exit status 1 (73.592437ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-682596 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.07s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.36s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 service list: exit status 103 (356.537389ms)

-- stdout --
	* The control-plane node functional-682596 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-682596"

-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-682596 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-682596 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-682596\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.36s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.32s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 service list -o json: exit status 103 (317.421065ms)

-- stdout --
	* The control-plane node functional-682596 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-682596"

-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-682596 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.32s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 service --namespace=default --https --url hello-node: exit status 103 (350.327644ms)

-- stdout --
	* The control-plane node functional-682596 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-682596"

-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-682596 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.35s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.34s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 service hello-node --url --format={{.IP}}: exit status 103 (337.057469ms)

-- stdout --
	* The control-plane node functional-682596 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-682596"

-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-682596 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-682596 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-682596\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.34s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.34s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 service hello-node --url: exit status 103 (344.151264ms)

-- stdout --
	* The control-plane node functional-682596 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-682596"

-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-682596 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-682596 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-682596"
functional_test.go:1579: failed to parse "* The control-plane node functional-682596 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-682596\"": parse "* The control-plane node functional-682596 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-682596\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.34s)

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.49s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1217 20:44:36.030387  435196 out.go:360] Setting OutFile to fd 1 ...
I1217 20:44:36.030832  435196 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:44:36.030842  435196 out.go:374] Setting ErrFile to fd 2...
I1217 20:44:36.030848  435196 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:44:36.031146  435196 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:44:36.031429  435196 mustload.go:66] Loading cluster: functional-682596
I1217 20:44:36.031910  435196 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:44:36.032441  435196 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:44:36.063494  435196 host.go:66] Checking if "functional-682596" exists ...
I1217 20:44:36.063848  435196 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1217 20:44:36.176948  435196 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:44:36.163124623 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1217 20:44:36.177073  435196 api_server.go:166] Checking apiserver status ...
I1217 20:44:36.177135  435196 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1217 20:44:36.177171  435196 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:44:36.205041  435196 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
W1217 20:44:36.321312  435196 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1217 20:44:36.324686  435196 out.go:179] * The control-plane node functional-682596 apiserver is not running: (state=Stopped)
I1217 20:44:36.327948  435196 out.go:179]   To start a cluster, run: "minikube start -p functional-682596"

stdout: * The control-plane node functional-682596 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-682596"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] ...
helpers_test.go:526: unable to kill pid 435195: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.49s)
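minikube's health probe is visible in the trace above: it inspects the container state, opens ssh on the mapped port, and greps for a kube-apiserver process. The same two checks by hand, using the commands from this run's trace (the pgrep pattern is quoted here so the shell does not expand it):

	docker container inspect functional-682596 --format={{.State.Status}}
	out/minikube-linux-arm64 -p functional-682596 ssh "sudo pgrep -xnf 'kube-apiserver.*minikube.*'"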

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.15s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-682596 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-682596 apply -f testdata/testsvc.yaml: exit status 1 (147.093538ms)

** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-682596 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.15s)
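The stderr above bundles two distinct failures: validation needs the apiserver's OpenAPI document, and the apiserver is unreachable. The workaround named in the error text only removes the first; a sketch showing that even with validation off the apply would still fail at the connection stage:

	# --validate=false skips the openapi download, but the POST to
	# 192.168.49.2:8441 is still refused while the control plane is down.
	kubectl --context functional-682596 apply -f testdata/testsvc.yaml --validate=false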

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (129.03s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:288: failed to hit nginx at "http://10.106.1.178": Temporary Error: Get "http://10.106.1.178": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-682596 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-682596 get svc nginx-svc: exit status 1 (64.301633ms)

** stderr ** 
	E1217 20:46:45.623321  436618 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.624889  436618 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.626386  436618 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.627875  436618 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	E1217 20:46:45.629390  436618 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-682596 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (129.03s)
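The probe failure here is a timeout rather than a refusal: nothing answers at the service IP 10.106.1.178 while the cluster is down, so the HTTP client waits out its deadline instead of getting an immediate error. Reproducing the probe by hand with a bounded timeout (a sketch; --max-time stands in for the Go client's Client.Timeout):

	curl --max-time 10 http://10.106.1.178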

x
+
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.36s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1766004412662439027" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1766004412662439027" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1766004412662439027" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001/test-1766004412662439027
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (378.305391ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1217 20:46:53.041021  369461 retry.go:31] will retry after 493.197229ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 17 20:46 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 17 20:46 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 17 20:46 test-1766004412662439027
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh cat /mount-9p/test-1766004412662439027
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-682596 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:148: (dbg) Non-zero exit: kubectl --context functional-682596 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (53.797602ms)

** stderr ** 
	E1217 20:46:54.383740  438091 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://192.168.49.2:8441/api?timeout=32s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	error: unable to recognize "testdata/busybox-mount-test.yaml": Get "https://192.168.49.2:8441/api?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test_mount_test.go:150: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-682596 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:80: "TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:81: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (258.394324ms)

-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=42105)
	total 2
	-rw-r--r-- 1 docker docker 24 Dec 17 20:46 created-by-test
	-rw-r--r-- 1 docker docker 24 Dec 17 20:46 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Dec 17 20:46 test-1766004412662439027
	cat: /mount-9p/pod-dates: No such file or directory

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:83: debugging command "out/minikube-linux-arm64 -p functional-682596 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
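The debug dump shows the 9p share itself is healthy: it is mounted read-write from 192.168.49.1 and the three seeded files are present. Only /mount-9p/pod-dates is missing, because the busybox pod that would write it was never created. The guest-side checks can be re-run by hand with the same commands this test issues:

	out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
	out/minikube-linux-arm64 -p functional-682596 ssh -- ls -la /mount-9p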
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:42105
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001 to /mount-9p

* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001:/mount-9p --alsologtostderr -v=1] stderr:
I1217 20:46:52.720031  437744 out.go:360] Setting OutFile to fd 1 ...
I1217 20:46:52.720281  437744 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:46:52.720290  437744 out.go:374] Setting ErrFile to fd 2...
I1217 20:46:52.720295  437744 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:46:52.720558  437744 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:46:52.720847  437744 mustload.go:66] Loading cluster: functional-682596
I1217 20:46:52.721225  437744 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:46:52.721756  437744 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:46:52.754478  437744 host.go:66] Checking if "functional-682596" exists ...
I1217 20:46:52.754872  437744 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1217 20:46:52.874130  437744 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:52.864436192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1217 20:46:52.874290  437744 cli_runner.go:164] Run: docker network inspect functional-682596 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1217 20:46:52.904556  437744 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001 into VM as /mount-9p ...
I1217 20:46:52.907505  437744 out.go:179]   - Mount type:   9p
I1217 20:46:52.910811  437744 out.go:179]   - User ID:      docker
I1217 20:46:52.913799  437744 out.go:179]   - Group ID:     docker
I1217 20:46:52.916659  437744 out.go:179]   - Version:      9p2000.L
I1217 20:46:52.919500  437744 out.go:179]   - Message Size: 262144
I1217 20:46:52.922372  437744 out.go:179]   - Options:      map[]
I1217 20:46:52.925196  437744 out.go:179]   - Bind Address: 192.168.49.1:42105
I1217 20:46:52.927962  437744 out.go:179] * Userspace file server: 
I1217 20:46:52.928317  437744 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1217 20:46:52.928429  437744 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:46:52.957449  437744 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:46:53.055096  437744 mount.go:180] unmount for /mount-9p ran successfully
I1217 20:46:53.055123  437744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1217 20:46:53.064370  437744 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=42105,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1217 20:46:53.075404  437744 main.go:127] stdlog: ufs.go:141 connected
I1217 20:46:53.075570  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tversion tag 65535 msize 262144 version '9P2000.L'
I1217 20:46:53.075611  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rversion tag 65535 msize 262144 version '9P2000'
I1217 20:46:53.075837  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1217 20:46:53.075901  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rattach tag 0 aqid (15c3b1d 2e10d0f4 'd')
I1217 20:46:53.076662  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 0
I1217 20:46:53.076734  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c3b1d 2e10d0f4 'd') m d775 at 0 mt 1766004412 l 4096 t 0 d 0 ext )
I1217 20:46:53.080907  437744 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/.mount-process: {Name:mkcea25327ef701879e518d3f141ffafdb4f6a33 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1217 20:46:53.081119  437744 mount.go:105] mount successful: ""
I1217 20:46:53.084668  437744 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3220205391/001 to /mount-9p
I1217 20:46:53.087613  437744 out.go:203] 
I1217 20:46:53.090592  437744 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1217 20:46:54.044750  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 0
I1217 20:46:54.044850  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c3b1d 2e10d0f4 'd') m d775 at 0 mt 1766004412 l 4096 t 0 d 0 ext )
I1217 20:46:54.045196  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 1 
I1217 20:46:54.045230  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 
I1217 20:46:54.045407  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Topen tag 0 fid 1 mode 0
I1217 20:46:54.045476  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Ropen tag 0 qid (15c3b1d 2e10d0f4 'd') iounit 0
I1217 20:46:54.045621  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 0
I1217 20:46:54.045662  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c3b1d 2e10d0f4 'd') m d775 at 0 mt 1766004412 l 4096 t 0 d 0 ext )
I1217 20:46:54.045816  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 0 count 262120
I1217 20:46:54.045942  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 258
I1217 20:46:54.046076  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 261862
I1217 20:46:54.046114  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.046244  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 262120
I1217 20:46:54.046271  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.046405  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1217 20:46:54.046463  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b1e 2e10d0f4 '') 
I1217 20:46:54.046594  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.046631  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c3b1e 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.046758  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.046798  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c3b1e 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.046926  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.046951  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.047080  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'test-1766004412662439027' 
I1217 20:46:54.047115  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b20 2e10d0f4 '') 
I1217 20:46:54.047244  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.047288  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.047428  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.047468  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.047602  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.047625  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.047768  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1217 20:46:54.047803  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b1f 2e10d0f4 '') 
I1217 20:46:54.047935  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.047971  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c3b1f 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.048087  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.048123  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c3b1f 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.048244  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.048287  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.048405  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 262120
I1217 20:46:54.048438  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.048584  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 1
I1217 20:46:54.048628  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.316575  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 1 0:'test-1766004412662439027' 
I1217 20:46:54.316651  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b20 2e10d0f4 '') 
I1217 20:46:54.316826  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 1
I1217 20:46:54.316879  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.317039  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 1 newfid 2 
I1217 20:46:54.317068  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 
I1217 20:46:54.317361  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Topen tag 0 fid 2 mode 0
I1217 20:46:54.317451  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Ropen tag 0 qid (15c3b20 2e10d0f4 '') iounit 0
I1217 20:46:54.317639  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 1
I1217 20:46:54.317689  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.317866  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 2 offset 0 count 262120
I1217 20:46:54.317917  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 24
I1217 20:46:54.318036  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 2 offset 24 count 262120
I1217 20:46:54.318066  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.318228  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 2 offset 24 count 262120
I1217 20:46:54.318278  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.318509  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.318542  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.318713  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 1
I1217 20:46:54.318751  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.635473  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 0
I1217 20:46:54.635547  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c3b1d 2e10d0f4 'd') m d775 at 0 mt 1766004412 l 4096 t 0 d 0 ext )
I1217 20:46:54.635903  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 1 
I1217 20:46:54.635939  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 
I1217 20:46:54.636059  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Topen tag 0 fid 1 mode 0
I1217 20:46:54.636109  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Ropen tag 0 qid (15c3b1d 2e10d0f4 'd') iounit 0
I1217 20:46:54.636295  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 0
I1217 20:46:54.636352  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c3b1d 2e10d0f4 'd') m d775 at 0 mt 1766004412 l 4096 t 0 d 0 ext )
I1217 20:46:54.636550  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 0 count 262120
I1217 20:46:54.636680  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 258
I1217 20:46:54.636822  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 261862
I1217 20:46:54.636871  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.636999  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 262120
I1217 20:46:54.637037  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.637177  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1217 20:46:54.637225  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b1e 2e10d0f4 '') 
I1217 20:46:54.637341  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.637384  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c3b1e 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.637524  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.637565  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c3b1e 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.637691  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.637717  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.637866  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'test-1766004412662439027' 
I1217 20:46:54.637899  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b20 2e10d0f4 '') 
I1217 20:46:54.638011  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.638045  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.638188  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.638220  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('test-1766004412662439027' 'jenkins' 'jenkins' '' q (15c3b20 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.638334  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.638356  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.638494  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1217 20:46:54.638526  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rwalk tag 0 (15c3b1f 2e10d0f4 '') 
I1217 20:46:54.638639  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.638678  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c3b1f 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.638797  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tstat tag 0 fid 2
I1217 20:46:54.638835  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c3b1f 2e10d0f4 '') m 644 at 0 mt 1766004412 l 24 t 0 d 0 ext )
I1217 20:46:54.638973  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 2
I1217 20:46:54.638998  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.639113  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tread tag 0 fid 1 offset 258 count 262120
I1217 20:46:54.639143  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rread tag 0 count 0
I1217 20:46:54.639285  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 1
I1217 20:46:54.639316  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.640523  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1217 20:46:54.640590  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rerror tag 0 ename 'file not found' ecode 0
I1217 20:46:54.901500  437744 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:51662 Tclunk tag 0 fid 0
I1217 20:46:54.901549  437744 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:51662 Rclunk tag 0
I1217 20:46:54.902725  437744 main.go:127] stdlog: ufs.go:147 disconnected
I1217 20:46:54.928322  437744 out.go:179] * Unmounting /mount-9p ...
I1217 20:46:54.931206  437744 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1217 20:46:54.939034  437744 mount.go:180] unmount for /mount-9p ran successfully
I1217 20:46:54.939146  437744 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/.mount-process: {Name:mkcea25327ef701879e518d3f141ffafdb4f6a33 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1217 20:46:54.942302  437744 out.go:203] 
W1217 20:46:54.945258  437744 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1217 20:46:54.948121  437744 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.36s)
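The MountCmd/any-port run above shows the mount itself succeeding before the process receives a termination signal (MK_INTERRUPTED): minikube's userspace 9p file server binds 192.168.49.1:42105 on the host, and the guest attaches with the mount -t 9p command logged at 20:46:53. Note in the Tversion/Rversion exchange that the server negotiates the requested 9P2000.L dialect down to 9P2000. A minimal sketch of the equivalent manual mount, with the bind address, port, msize, and uid/gid options copied from this log (illustrative only, not a supported minikube entry point):

    # Hedged sketch: attach to the test's 9p server by hand from inside the guest.
    # All values (192.168.49.1, port 42105, msize 262144, user/group "docker")
    # come from the log above; they change per run.
    sudo mkdir -p /mount-9p
    sudo mount -t 9p \
      -o "dfltgid=$(grep '^docker:' /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=42105,trans=tcp,version=9p2000.L" \
      192.168.49.1 /mount-9p
    # Clean up the way the test does: force a lazy unmount even if busy.
    sudo umount -f -l /mount-9p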

TestKubernetesUpgrade (794.69s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (34.047469275s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-332113
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-332113: (1.358225622s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-332113 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-332113 status --format={{.Host}}: exit status 7 (76.763744ms)

-- stdout --
	Stopped
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
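At this point the pre-upgrade half of the test has passed: the v1.28.0 cluster started, was stopped, and reports Stopped. The sketch below condenses the full sequence the test drives; every command and flag is copied verbatim from the (dbg) Run lines in this section, so only the condensed form is new:

    # Condensed from the (dbg) Run lines in this section; same profile and flags.
    out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 \
      --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker --container-runtime=containerd
    out/minikube-linux-arm64 stop -p kubernetes-upgrade-332113
    out/minikube-linux-arm64 -p kubernetes-upgrade-332113 status --format={{.Host}}   # prints "Stopped", exit 7
    out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 \
      --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker --container-runtime=containerd   # fails below, exit status 109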
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: exit status 109 (12m34.234340967s)
-- stdout --
	* [kubernetes-upgrade-332113] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-332113" primary control-plane node in "kubernetes-upgrade-332113" cluster
	* Pulling base image v0.0.48-1765661130-22141 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	
	
-- /stdout --
** stderr ** 
	I1217 21:18:49.491561  588228 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:18:49.491680  588228 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:18:49.491685  588228 out.go:374] Setting ErrFile to fd 2...
	I1217 21:18:49.491690  588228 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:18:49.491961  588228 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:18:49.492388  588228 out.go:368] Setting JSON to false
	I1217 21:18:49.493357  588228 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":14475,"bootTime":1765991855,"procs":194,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 21:18:49.493430  588228 start.go:143] virtualization:  
	I1217 21:18:49.498640  588228 out.go:179] * [kubernetes-upgrade-332113] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 21:18:49.501744  588228 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 21:18:49.501812  588228 notify.go:221] Checking for updates...
	I1217 21:18:49.510030  588228 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 21:18:49.513062  588228 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 21:18:49.516125  588228 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 21:18:49.519220  588228 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 21:18:49.522364  588228 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 21:18:49.525978  588228 config.go:182] Loaded profile config "kubernetes-upgrade-332113": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.28.0
	I1217 21:18:49.526604  588228 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 21:18:49.551281  588228 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 21:18:49.551407  588228 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:18:49.622762  588228 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 21:18:49.612801162 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:18:49.622868  588228 docker.go:319] overlay module found
	I1217 21:18:49.626118  588228 out.go:179] * Using the docker driver based on existing profile
	I1217 21:18:49.628941  588228 start.go:309] selected driver: docker
	I1217 21:18:49.628989  588228 start.go:927] validating driver "docker" against &{Name:kubernetes-upgrade-332113 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-332113 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 21:18:49.629123  588228 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 21:18:49.629838  588228 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:18:49.691162  588228 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 21:18:49.682016252 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:18:49.691511  588228 cni.go:84] Creating CNI manager for ""
	I1217 21:18:49.691571  588228 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 21:18:49.691605  588228 start.go:353] cluster config:
	{Name:kubernetes-upgrade-332113 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-332113 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 21:18:49.694721  588228 out.go:179] * Starting "kubernetes-upgrade-332113" primary control-plane node in "kubernetes-upgrade-332113" cluster
	I1217 21:18:49.697506  588228 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 21:18:49.700602  588228 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 21:18:49.703583  588228 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 21:18:49.703636  588228 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 21:18:49.703647  588228 cache.go:65] Caching tarball of preloaded images
	I1217 21:18:49.703684  588228 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 21:18:49.703740  588228 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 21:18:49.703751  588228 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1217 21:18:49.703871  588228 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/config.json ...
	I1217 21:18:49.725456  588228 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 21:18:49.725479  588228 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 21:18:49.725499  588228 cache.go:243] Successfully downloaded all kic artifacts
	I1217 21:18:49.725526  588228 start.go:360] acquireMachinesLock for kubernetes-upgrade-332113: {Name:mkf3c35f4f6e75220f36467aa1d63c5e20b09eef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 21:18:49.725592  588228 start.go:364] duration metric: took 43.808µs to acquireMachinesLock for "kubernetes-upgrade-332113"
	I1217 21:18:49.725616  588228 start.go:96] Skipping create...Using existing machine configuration
	I1217 21:18:49.725625  588228 fix.go:54] fixHost starting: 
	I1217 21:18:49.725889  588228 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-332113 --format={{.State.Status}}
	I1217 21:18:49.744274  588228 fix.go:112] recreateIfNeeded on kubernetes-upgrade-332113: state=Stopped err=<nil>
	W1217 21:18:49.744303  588228 fix.go:138] unexpected machine state, will restart: <nil>
	I1217 21:18:49.747460  588228 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-332113" ...
	I1217 21:18:49.747557  588228 cli_runner.go:164] Run: docker start kubernetes-upgrade-332113
	I1217 21:18:50.003942  588228 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-332113 --format={{.State.Status}}
	I1217 21:18:50.033154  588228 kic.go:430] container "kubernetes-upgrade-332113" state is running.
	I1217 21:18:50.033582  588228 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-332113
	I1217 21:18:50.055909  588228 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/config.json ...
	I1217 21:18:50.056216  588228 machine.go:94] provisionDockerMachine start ...
	I1217 21:18:50.056349  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:50.084159  588228 main.go:143] libmachine: Using SSH client type: native
	I1217 21:18:50.084600  588228 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33413 <nil> <nil>}
	I1217 21:18:50.084621  588228 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 21:18:50.086622  588228 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:58366->127.0.0.1:33413: read: connection reset by peer
	I1217 21:18:53.219824  588228 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-332113
	
	I1217 21:18:53.219851  588228 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-332113"
	I1217 21:18:53.219935  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:53.237470  588228 main.go:143] libmachine: Using SSH client type: native
	I1217 21:18:53.237781  588228 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33413 <nil> <nil>}
	I1217 21:18:53.237798  588228 main.go:143] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-332113 && echo "kubernetes-upgrade-332113" | sudo tee /etc/hostname
	I1217 21:18:53.378244  588228 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-332113
	
	I1217 21:18:53.378332  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:53.397234  588228 main.go:143] libmachine: Using SSH client type: native
	I1217 21:18:53.397551  588228 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33413 <nil> <nil>}
	I1217 21:18:53.397568  588228 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-332113' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-332113/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-332113' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 21:18:53.528685  588228 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 21:18:53.528711  588228 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 21:18:53.528787  588228 ubuntu.go:190] setting up certificates
	I1217 21:18:53.528797  588228 provision.go:84] configureAuth start
	I1217 21:18:53.528857  588228 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-332113
	I1217 21:18:53.547311  588228 provision.go:143] copyHostCerts
	I1217 21:18:53.547389  588228 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 21:18:53.547398  588228 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 21:18:53.547475  588228 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 21:18:53.547579  588228 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 21:18:53.547585  588228 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 21:18:53.547611  588228 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 21:18:53.547706  588228 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 21:18:53.547710  588228 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 21:18:53.547733  588228 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 21:18:53.547777  588228 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-332113 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-332113 localhost minikube]
	I1217 21:18:53.622128  588228 provision.go:177] copyRemoteCerts
	I1217 21:18:53.622205  588228 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 21:18:53.622252  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:53.642345  588228 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33413 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/kubernetes-upgrade-332113/id_rsa Username:docker}
	I1217 21:18:53.740862  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 21:18:53.760444  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1217 21:18:53.779028  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 21:18:53.798557  588228 provision.go:87] duration metric: took 269.747385ms to configureAuth
	I1217 21:18:53.798595  588228 ubuntu.go:206] setting minikube options for container-runtime
	I1217 21:18:53.798811  588228 config.go:182] Loaded profile config "kubernetes-upgrade-332113": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 21:18:53.798819  588228 machine.go:97] duration metric: took 3.742587294s to provisionDockerMachine
	I1217 21:18:53.798828  588228 start.go:293] postStartSetup for "kubernetes-upgrade-332113" (driver="docker")
	I1217 21:18:53.798840  588228 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 21:18:53.798895  588228 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 21:18:53.798933  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:53.817267  588228 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33413 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/kubernetes-upgrade-332113/id_rsa Username:docker}
	I1217 21:18:53.914439  588228 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 21:18:53.918397  588228 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 21:18:53.918431  588228 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 21:18:53.918444  588228 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 21:18:53.918544  588228 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 21:18:53.918672  588228 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 21:18:53.918819  588228 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1217 21:18:53.926590  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:18:53.944995  588228 start.go:296] duration metric: took 146.151115ms for postStartSetup
	I1217 21:18:53.945108  588228 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 21:18:53.945171  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:53.963355  588228 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33413 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/kubernetes-upgrade-332113/id_rsa Username:docker}
	I1217 21:18:54.061674  588228 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 21:18:54.066722  588228 fix.go:56] duration metric: took 4.341089231s for fixHost
	I1217 21:18:54.066747  588228 start.go:83] releasing machines lock for "kubernetes-upgrade-332113", held for 4.341142975s
	I1217 21:18:54.066820  588228 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-332113
	I1217 21:18:54.083989  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 21:18:54.084051  588228 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 21:18:54.084062  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 21:18:54.084096  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 21:18:54.084126  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 21:18:54.084156  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 21:18:54.084206  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:18:54.084461  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 21:18:54.084536  588228 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-332113
	I1217 21:18:54.102369  588228 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33413 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/kubernetes-upgrade-332113/id_rsa Username:docker}
	I1217 21:18:54.214820  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 21:18:54.234559  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 21:18:54.254545  588228 ssh_runner.go:195] Run: openssl version
	I1217 21:18:54.261447  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 21:18:54.270800  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 21:18:54.280161  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 21:18:54.284241  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 21:18:54.284339  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 21:18:54.326324  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 21:18:54.333883  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:18:54.341398  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 21:18:54.350015  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:18:54.353989  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:18:54.354104  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:18:54.395797  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 21:18:54.403519  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 21:18:54.411013  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 21:18:54.418739  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 21:18:54.422668  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 21:18:54.422810  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 21:18:54.464473  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 21:18:54.473809  588228 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 21:18:54.477693  588228 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
	I1217 21:18:54.481426  588228 ssh_runner.go:195] Run: cat /version.json
	I1217 21:18:54.481564  588228 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 21:18:54.486197  588228 ssh_runner.go:195] Run: systemctl --version
	I1217 21:18:54.581579  588228 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 21:18:54.586657  588228 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 21:18:54.586742  588228 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 21:18:54.594663  588228 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1217 21:18:54.594687  588228 start.go:496] detecting cgroup driver to use...
	I1217 21:18:54.594747  588228 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 21:18:54.594811  588228 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 21:18:54.613781  588228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 21:18:54.627777  588228 docker.go:218] disabling cri-docker service (if available) ...
	I1217 21:18:54.627872  588228 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 21:18:54.643811  588228 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 21:18:54.657051  588228 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 21:18:54.784476  588228 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 21:18:54.906509  588228 docker.go:234] disabling docker service ...
	I1217 21:18:54.906614  588228 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 21:18:54.921634  588228 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 21:18:54.934394  588228 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 21:18:55.044568  588228 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 21:18:55.166905  588228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 21:18:55.180394  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 21:18:55.202452  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 21:18:55.213147  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 21:18:55.223121  588228 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 21:18:55.223201  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 21:18:55.232996  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 21:18:55.243127  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 21:18:55.252314  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 21:18:55.262255  588228 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 21:18:55.270458  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 21:18:55.279830  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 21:18:55.288868  588228 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 21:18:55.298288  588228 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 21:18:55.306187  588228 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 21:18:55.313865  588228 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 21:18:55.424496  588228 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1217 21:18:55.585885  588228 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 21:18:55.585969  588228 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 21:18:55.590272  588228 start.go:564] Will wait 60s for crictl version
	I1217 21:18:55.590366  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:18:55.594418  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 21:18:55.622449  588228 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 21:18:55.622557  588228 ssh_runner.go:195] Run: containerd --version
	I1217 21:18:55.654037  588228 ssh_runner.go:195] Run: containerd --version
	I1217 21:18:55.680508  588228 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1217 21:18:55.683397  588228 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-332113 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 21:18:55.700575  588228 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1217 21:18:55.704485  588228 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1217 21:18:55.715145  588228 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-332113 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-332113 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 21:18:55.715263  588228 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 21:18:55.715323  588228 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 21:18:55.741799  588228 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-rc.1". assuming images are not preloaded.
	I1217 21:18:55.741873  588228 ssh_runner.go:195] Run: which lz4
	I1217 21:18:55.745828  588228 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I1217 21:18:55.749776  588228 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I1217 21:18:55.749815  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 --> /preloaded.tar.lz4 (305659384 bytes)
	I1217 21:18:58.884578  588228 containerd.go:563] duration metric: took 3.138789756s to copy over tarball
	I1217 21:18:58.884666  588228 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I1217 21:19:01.281431  588228 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.39672142s)
	I1217 21:19:01.281577  588228 kubeadm.go:910] preload failed, will try to load cached images: extracting tarball: 
	** stderr ** 
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Europe: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Brazil: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Canada: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Antarctica: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Chile: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Etc: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Pacific: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Mexico: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Australia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/US: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Asia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Atlantic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/America: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Arctic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Africa: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Indian: Cannot open: File exists
	tar: Exiting with failure status due to previous errors
	
	** /stderr **: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: Process exited with status 2
	stdout:
	
	stderr:
	(… same sixteen "Cannot open: File exists" tar errors as quoted above …)
	tar: Exiting with failure status due to previous errors
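
The run of "Cannot open: File exists" errors above is tar refusing to unpack the preload over containerd snapshot files that already exist from the earlier v1.28.0 cluster on this node; tar exits with status 2 and minikube falls back to loading images from its on-disk cache. A minimal local sketch of that extraction step and its failure mode (paths mirror the log; this is not minikube's exact internals, and it needs tar with lz4 support):

    package main

    import (
        "errors"
        "fmt"
        "os/exec"
    )

    func main() {
        // Same invocation as the log: unpack the lz4-compressed preload into /var.
        cmd := exec.Command("sudo", "tar",
            "--xattrs", "--xattrs-include", "security.capability",
            "-I", "lz4", "-C", "/var", "-xf", "/preloaded.tar.lz4")
        out, err := cmd.CombinedOutput()
        var exitErr *exec.ExitError
        if errors.As(err, &exitErr) {
            // GNU tar reports fatal errors (e.g. "Cannot open: File exists")
            // with exit status 2, which is what triggers the fallback above.
            fmt.Printf("tar failed (status %d):\n%s", exitErr.ExitCode(), out)
            return
        }
        if err != nil {
            fmt.Println("could not run tar:", err)
            return
        }
        fmt.Println("preload extracted cleanly")
    }
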
	I1217 21:19:01.281665  588228 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 21:19:01.317670  588228 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-rc.1". assuming images are not preloaded.
	I1217 21:19:01.317697  588228 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-rc.1 registry.k8s.io/kube-controller-manager:v1.35.0-rc.1 registry.k8s.io/kube-scheduler:v1.35.0-rc.1 registry.k8s.io/kube-proxy:v1.35.0-rc.1 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.6-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1217 21:19:01.317762  588228 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:19:01.317975  588228 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.318079  588228 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.318172  588228 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.318265  588228 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.318357  588228 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1217 21:19:01.318461  588228 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.318550  588228 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.319651  588228 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.6-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.320205  588228 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.320458  588228 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.320639  588228 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.320854  588228 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:19:01.321118  588228 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1217 21:19:01.321346  588228 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.321517  588228 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.626118  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.6-0" and sha "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57"
	I1217 21:19:01.626216  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.634699  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1217 21:19:01.634814  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.653727  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" and sha "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde"
	I1217 21:19:01.653854  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.656453  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1217 21:19:01.656521  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1217 21:19:01.666590  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" and sha "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a"
	I1217 21:19:01.666667  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.668313  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-rc.1" and sha "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e"
	I1217 21:19:01.668382  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.717251  588228 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1217 21:19:01.717294  588228 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.717355  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.717473  588228 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" does not exist at hash "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde" in container runtime
	I1217 21:19:01.717501  588228 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.717532  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.717652  588228 cache_images.go:118] "registry.k8s.io/etcd:3.6.6-0" needs transfer: "registry.k8s.io/etcd:3.6.6-0" does not exist at hash "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57" in container runtime
	I1217 21:19:01.717672  588228 cri.go:218] Removing image: registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.717724  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.725058  588228 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1217 21:19:01.725096  588228 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1217 21:19:01.725149  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.725247  588228 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" does not exist at hash "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a" in container runtime
	I1217 21:19:01.725267  588228 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.725303  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.733977  588228 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" and sha "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54"
	I1217 21:19:01.734060  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.741038  588228 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-rc.1" does not exist at hash "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e" in container runtime
	I1217 21:19:01.741084  588228 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.741134  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.741242  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.741319  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.741385  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.741452  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1217 21:19:01.741534  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.786305  588228 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" does not exist at hash "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54" in container runtime
	I1217 21:19:01.786347  588228 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.786396  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:01.845080  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.845151  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.845210  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.845260  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.845305  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.845386  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1217 21:19:01.845397  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.952428  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:01.952560  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:01.952563  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1217 21:19:01.952677  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1217 21:19:01.952726  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1217 21:19:01.952803  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1217 21:19:01.952884  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
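
Each image in the list is first checked by name and expected digest with ctr in containerd's k8s.io namespace; anything missing or mismatched is untagged with crictl rmi so a cached copy can be imported in its place. A compressed sketch of that check-then-remove step (the helper name is illustrative, not minikube's):

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    // imagePresent reports whether ctr lists the reference in the k8s.io
    // namespace, mirroring the `ctr -n=k8s.io images ls name==...` runs above.
    func imagePresent(ref string) bool {
        out, err := exec.Command("sudo", "ctr", "-n=k8s.io",
            "images", "ls", "name=="+ref).Output()
        return err == nil && strings.Contains(string(out), ref)
    }

    func main() {
        ref := "registry.k8s.io/pause:3.10.1"
        if !imagePresent(ref) {
            // Missing or stale: drop the tag so the cached tarball can be loaded.
            if err := exec.Command("sudo", "crictl", "rmi", ref).Run(); err != nil {
                fmt.Println("rmi failed:", err)
            }
        }
    }
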
	I1217 21:19:02.061774  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1217 21:19:02.061928  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1217 21:19:02.062028  588228 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1217 21:19:02.062079  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1217 21:19:02.062110  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-rc.1
	I1217 21:19:02.062083  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1217 21:19:02.062202  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.6-0
	I1217 21:19:02.062180  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-rc.1
	I1217 21:19:02.106572  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-rc.1
	I1217 21:19:02.106674  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-rc.1
	I1217 21:19:02.106738  588228 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1217 21:19:02.106760  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1217 21:19:02.139813  588228 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1217 21:19:02.139940  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	W1217 21:19:02.508883  588228 image.go:328] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1217 21:19:02.509085  588228 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1217 21:19:02.509162  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:19:02.532631  588228 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1217 21:19:02.532672  588228 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:19:02.532730  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:02.536459  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:19:02.650204  588228 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1217 21:19:02.650353  588228 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1217 21:19:02.654062  588228 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1217 21:19:02.654098  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1217 21:19:02.742582  588228 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1217 21:19:02.742699  588228 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1217 21:19:03.136427  588228 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1217 21:19:03.136491  588228 cache_images.go:94] duration metric: took 1.818780349s to LoadCachedImages
	W1217 21:19:03.136581  588228 out.go:285] X Unable to load cached images: LoadCachedImages: stat /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-rc.1: no such file or directory
	X Unable to load cached images: LoadCachedImages: stat /home/jenkins/minikube-integration/21808-367595/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-rc.1: no such file or directory
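
Only storage-provisioner was actually present in the local image cache; the v1.35.0-rc.1 kube-* images were not, which is what the LoadCachedImages error reports. For the one image that did exist, the load path is: stat the target, scp the tarball to /var/lib/minikube/images, then import it with ctr. A sketch of that final import step:

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        // Import a cached image tarball into containerd's k8s.io namespace,
        // matching the "ctr -n=k8s.io images import" runs above.
        out, err := exec.Command("sudo", "ctr", "-n=k8s.io", "images",
            "import", "/var/lib/minikube/images/storage-provisioner_v5").CombinedOutput()
        if err != nil {
            fmt.Printf("import failed: %v\n%s", err, out)
            return
        }
        fmt.Printf("%s", out)
    }
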
	I1217 21:19:03.136600  588228 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-rc.1 containerd true true} ...
	I1217 21:19:03.136840  588228 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=kubernetes-upgrade-332113 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-332113 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
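
The kubelet unit shown above is installed as a systemd drop-in that clears the stock ExecStart and substitutes the version-pinned binary with profile-specific flags; the scp lines a few steps down write it to /etc/systemd/system/kubelet.service.d/10-kubeadm.conf. A sketch of writing and activating such a drop-in, assuming it runs as root (contents abbreviated to the flags in the log):

    package main

    import (
        "os"
        "os/exec"
    )

    func main() {
        dropIn := "[Unit]\nWants=containerd.service\n\n[Service]\nExecStart=\n" +
            "ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet" +
            " --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf" +
            " --config=/var/lib/kubelet/config.yaml" +
            " --hostname-override=kubernetes-upgrade-332113" +
            " --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2\n"
        path := "/etc/systemd/system/kubelet.service.d/10-kubeadm.conf"
        if err := os.WriteFile(path, []byte(dropIn), 0o644); err != nil {
            panic(err)
        }
        // Pick up the new unit file and start the kubelet, as the log does next.
        _ = exec.Command("systemctl", "daemon-reload").Run()
        _ = exec.Command("systemctl", "start", "kubelet").Run()
    }
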
	I1217 21:19:03.136919  588228 ssh_runner.go:195] Run: sudo crictl info
	I1217 21:19:03.164878  588228 cni.go:84] Creating CNI manager for ""
	I1217 21:19:03.164906  588228 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 21:19:03.164919  588228 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 21:19:03.164943  588228 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-332113 NodeName:kubernetes-upgrade-332113 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 21:19:03.165059  588228 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "kubernetes-upgrade-332113"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
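
The generated config above is the kubeadm v1beta4 form (note the name/value extraArgs lists) that lands in /var/tmp/minikube/kubeadm.yaml.new. Recent kubeadm releases can sanity-check such a file before any init phase runs; a sketch, assuming `kubeadm config validate` is available in the pinned binary:

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        // Validate the generated config with the version-pinned kubeadm binary.
        cmd := exec.Command("sudo",
            "/var/lib/minikube/binaries/v1.35.0-rc.1/kubeadm",
            "config", "validate", "--config", "/var/tmp/minikube/kubeadm.yaml.new")
        if out, err := cmd.CombinedOutput(); err != nil {
            fmt.Printf("config rejected: %v\n%s", err, out)
            return
        }
        fmt.Println("kubeadm config is valid")
    }
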
	I1217 21:19:03.165168  588228 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1217 21:19:03.173272  588228 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 21:19:03.173364  588228 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 21:19:03.182957  588228 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (334 bytes)
	I1217 21:19:03.199688  588228 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1217 21:19:03.214430  588228 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2243 bytes)
	I1217 21:19:03.227460  588228 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1217 21:19:03.231918  588228 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
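
The /etc/hosts one-liner above works in three steps: grep -v strips any stale control-plane.minikube.internal entry, echo appends the fresh mapping, and the combined output goes to a temp file that sudo cp moves over /etc/hosts (a plain shell redirect would not survive the privilege boundary). The same logic, spelled out in Go for clarity:

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "strings"
    )

    func main() {
        data, err := os.ReadFile("/etc/hosts")
        if err != nil {
            panic(err)
        }
        var kept []string
        for _, line := range strings.Split(strings.TrimRight(string(data), "\n"), "\n") {
            // grep -v: drop any stale mapping for the control-plane alias.
            if strings.HasSuffix(line, "\tcontrol-plane.minikube.internal") {
                continue
            }
            kept = append(kept, line)
        }
        // echo: append the fresh mapping.
        kept = append(kept, "192.168.76.2\tcontrol-plane.minikube.internal")
        tmp := fmt.Sprintf("/tmp/h.%d", os.Getpid())
        if err := os.WriteFile(tmp, []byte(strings.Join(kept, "\n")+"\n"), 0o644); err != nil {
            panic(err)
        }
        // sudo cp: writing via a temp file survives the privilege boundary.
        _ = exec.Command("sudo", "cp", tmp, "/etc/hosts").Run()
    }
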
	I1217 21:19:03.243914  588228 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 21:19:03.367166  588228 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 21:19:03.384210  588228 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113 for IP: 192.168.76.2
	I1217 21:19:03.384230  588228 certs.go:195] generating shared ca certs ...
	I1217 21:19:03.384281  588228 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:19:03.384434  588228 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 21:19:03.384495  588228 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 21:19:03.384509  588228 certs.go:257] generating profile certs ...
	I1217 21:19:03.384616  588228 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/client.key
	I1217 21:19:03.384689  588228 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/apiserver.key.1e12bc19
	I1217 21:19:03.384752  588228 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/proxy-client.key
	I1217 21:19:03.384864  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 21:19:03.384905  588228 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 21:19:03.384917  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 21:19:03.384946  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 21:19:03.384979  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 21:19:03.385008  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 21:19:03.385063  588228 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:19:03.385638  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 21:19:03.410317  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 21:19:03.435809  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 21:19:03.458437  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 21:19:03.478690  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1217 21:19:03.498406  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 21:19:03.518986  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 21:19:03.539332  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1217 21:19:03.558161  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 21:19:03.579607  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 21:19:03.598830  588228 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 21:19:03.618934  588228 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 21:19:03.632570  588228 ssh_runner.go:195] Run: openssl version
	I1217 21:19:03.639791  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 21:19:03.647388  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 21:19:03.655493  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 21:19:03.659468  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 21:19:03.659537  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 21:19:03.701098  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 21:19:03.709845  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:19:03.717244  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 21:19:03.724570  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:19:03.728839  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:19:03.728946  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:19:03.770373  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 21:19:03.777783  588228 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 21:19:03.785671  588228 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 21:19:03.793395  588228 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 21:19:03.797141  588228 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 21:19:03.797208  588228 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 21:19:03.838875  588228 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
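
The three test/ln/openssl cycles above implement c_rehash-style trust: each CA in /usr/share/ca-certificates is symlinked into /etc/ssl/certs under its OpenSSL subject hash plus a .0 suffix (b5213941.0 is minikubeCA's hash in this run). One cycle, sketched:

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    func main() {
        pem := "/usr/share/ca-certificates/minikubeCA.pem"
        // Subject hash, as in `openssl x509 -hash -noout -in ...` above.
        out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pem).Output()
        if err != nil {
            panic(err)
        }
        // e.g. "b5213941", matching the /etc/ssl/certs/b5213941.0 check above.
        hash := strings.TrimSpace(string(out))
        link := "/etc/ssl/certs/" + hash + ".0"
        _ = exec.Command("sudo", "ln", "-fs", pem, link).Run()
        fmt.Println("linked", link)
    }
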
	I1217 21:19:03.846780  588228 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 21:19:03.850926  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1217 21:19:03.892069  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1217 21:19:03.935227  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1217 21:19:03.976300  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1217 21:19:04.018501  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1217 21:19:04.068371  588228 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
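
Each `openssl x509 -checkend 86400` call above exits zero only if the certificate is still valid 24 hours from now; a non-zero exit would force regeneration before restart. The pure-Go equivalent of that check:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        data, err := os.ReadFile("/var/lib/minikube/certs/apiserver-kubelet-client.crt")
        if err != nil {
            panic(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            panic("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            panic(err)
        }
        // Mirrors `openssl x509 -checkend 86400`: fail if expiry is <24h away.
        if time.Until(cert.NotAfter) < 24*time.Hour {
            fmt.Println("expires within 24h: would regenerate")
        } else {
            fmt.Println("valid past the 24h window")
        }
    }
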
	I1217 21:19:04.110835  588228 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-332113 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-332113 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 21:19:04.110930  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 21:19:04.111003  588228 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 21:19:04.139178  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:19:04.139201  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:19:04.139206  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:19:04.139209  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:19:04.139213  588228 cri.go:89] found id: ""
	I1217 21:19:04.139270  588228 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	W1217 21:19:04.156067  588228 kubeadm.go:408] unpause failed: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-17T21:19:04Z" level=error msg="open /run/containerd/runc/k8s.io: no such file or directory"
	I1217 21:19:04.156141  588228 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 21:19:04.164468  588228 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1217 21:19:04.164499  588228 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1217 21:19:04.164551  588228 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1217 21:19:04.172297  588228 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1217 21:19:04.172973  588228 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-332113" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 21:19:04.173224  588228 kubeconfig.go:62] /home/jenkins/minikube-integration/21808-367595/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-332113" cluster setting kubeconfig missing "kubernetes-upgrade-332113" context setting]
	I1217 21:19:04.173688  588228 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:19:04.174327  588228 kapi.go:59] client config for kubernetes-upgrade-332113: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/client.crt", KeyFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kubernetes-upgrade-332113/client.key", CAFile:"/home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1217 21:19:04.174860  588228 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1217 21:19:04.174879  588228 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1217 21:19:04.174885  588228 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1217 21:19:04.174892  588228 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1217 21:19:04.174901  588228 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1217 21:19:04.175156  588228 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1217 21:19:04.185225  588228 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-17 21:18:29.577703026 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-17 21:19:03.221965715 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///run/containerd/containerd.sock
	   name: "kubernetes-upgrade-332113"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-rc.1
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1217 21:19:04.185291  588228 kubeadm.go:1161] stopping kube-system containers ...
	I1217 21:19:04.185316  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1217 21:19:04.185402  588228 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 21:19:04.215665  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:19:04.215684  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:19:04.215689  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:19:04.215692  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:19:04.215696  588228 cri.go:89] found id: ""
	I1217 21:19:04.215701  588228 cri.go:252] Stopping containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:19:04.215754  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:19:04.220122  588228 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd
	I1217 21:19:04.266360  588228 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1217 21:19:04.289062  588228 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 21:19:04.298413  588228 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5643 Dec 17 21:18 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5652 Dec 17 21:18 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Dec 17 21:18 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5600 Dec 17 21:18 /etc/kubernetes/scheduler.conf
	
	I1217 21:19:04.298478  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1217 21:19:04.307166  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1217 21:19:04.315118  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1217 21:19:04.323050  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 21:19:04.323162  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 21:19:04.331646  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1217 21:19:04.339378  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1217 21:19:04.339463  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 21:19:04.347176  588228 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 21:19:04.355068  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 21:19:04.407192  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 21:19:05.785694  588228 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.378468998s)
	I1217 21:19:05.785823  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1217 21:19:06.020871  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1217 21:19:06.088790  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
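
Because existing configuration files were found, the restart path replays individual kubeadm init phases rather than running a full init: certs, kubeconfig, kubelet-start, control-plane, then local etcd, each with the pinned binaries prepended to PATH. The same sequence, sketched as a loop (error handling simplified relative to minikube's):

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        phases := []string{
            "certs all", "kubeconfig all", "kubelet-start",
            "control-plane all", "etcd local",
        }
        for _, p := range phases {
            // Same shape as the log: pinned binaries first on PATH.
            cmd := "env PATH=\"/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH\" " +
                "kubeadm init phase " + p + " --config /var/tmp/minikube/kubeadm.yaml"
            if out, err := exec.Command("sudo", "/bin/bash", "-c", cmd).CombinedOutput(); err != nil {
                fmt.Printf("phase %q failed: %v\n%s", p, err, out)
                return
            }
        }
        fmt.Println("all restart phases completed")
    }
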
	I1217 21:19:06.135365  588228 api_server.go:52] waiting for apiserver process to appear ...
	I1217 21:19:06.135448  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:19:06.636327  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:19:07.136244  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	(… the same pgrep probe repeats at ~500 ms intervals, unchanged except for the timestamp, through 21:19:57 …)
	I1217 21:19:58.136560  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:19:58.636264  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:19:59.135615  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:19:59.636564  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:00.136395  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:00.635708  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:01.135650  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:01.635923  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:02.135584  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:02.636411  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:03.136363  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:03.636417  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:04.136490  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:04.635731  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:05.136440  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:05.636384  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
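
The run of lines above is minikube polling for the kube-apiserver process at roughly 500ms intervals. Below is a minimal illustrative sketch of that wait loop, not minikube's actual implementation: the real code runs `pgrep` over SSH inside the node via ssh_runner, while this sketch runs it locally with os/exec, and the function name waitForAPIServerProcess and the 2-minute deadline are invented for the example.

    // Sketch of the polling loop seen in the log: run
    // `sudo pgrep -xnf kube-apiserver.*minikube.*` about every 500ms
    // until the process appears or the context expires.
    package main

    import (
    	"context"
    	"fmt"
    	"os/exec"
    	"time"
    )

    func waitForAPIServerProcess(ctx context.Context, interval time.Duration) error {
    	ticker := time.NewTicker(interval)
    	defer ticker.Stop()
    	for {
    		// pgrep exits 0 when at least one matching process exists.
    		cmd := exec.CommandContext(ctx, "sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*")
    		if err := cmd.Run(); err == nil {
    			return nil // apiserver process found
    		}
    		select {
    		case <-ctx.Done():
    			return fmt.Errorf("kube-apiserver did not appear: %w", ctx.Err())
    		case <-ticker.C:
    		}
    	}
    }

    func main() {
    	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
    	defer cancel()
    	if err := waitForAPIServerProcess(ctx, 500*time.Millisecond); err != nil {
    		fmt.Println("wait failed:", err)
    		return
    	}
    	fmt.Println("kube-apiserver is running")
    }

In this test run the loop never succeeds within its window, which is what triggers the diagnostic log gathering that follows.
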
	I1217 21:20:06.137599  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:06.137761  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:06.178794  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:06.178872  588228 cri.go:89] found id: ""
	I1217 21:20:06.178894  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:06.178986  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:06.189120  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:06.189216  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:06.219635  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:06.219659  588228 cri.go:89] found id: ""
	I1217 21:20:06.219667  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:06.219757  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:06.224269  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:06.224356  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:06.265537  588228 cri.go:89] found id: ""
	I1217 21:20:06.265573  588228 logs.go:282] 0 containers: []
	W1217 21:20:06.265583  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:06.265590  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:06.265691  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:06.309799  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:06.309822  588228 cri.go:89] found id: ""
	I1217 21:20:06.309830  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:06.309913  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:06.317158  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:06.317268  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:06.362796  588228 cri.go:89] found id: ""
	I1217 21:20:06.362820  588228 logs.go:282] 0 containers: []
	W1217 21:20:06.362830  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:06.362836  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:06.362937  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:06.411743  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:06.411825  588228 cri.go:89] found id: ""
	I1217 21:20:06.411848  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:06.411937  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:06.416657  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:06.416790  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:06.468329  588228 cri.go:89] found id: ""
	I1217 21:20:06.468410  588228 logs.go:282] 0 containers: []
	W1217 21:20:06.468433  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:06.468450  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:06.468564  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:06.553588  588228 cri.go:89] found id: ""
	I1217 21:20:06.553609  588228 logs.go:282] 0 containers: []
	W1217 21:20:06.553618  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:06.553633  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:06.553648  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:06.684962  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:06.685071  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:06.759878  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:06.759956  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:06.800130  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:06.800312  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:06.839810  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:06.839841  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:06.864222  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:06.864259  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:06.962861  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
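
The repeated "connection to the server localhost:8443 was refused" above means the apiserver container exists (crictl found an ID for it) but nothing is accepting connections on the apiserver port yet. A tiny illustrative probe that distinguishes that state, assuming the default port 8443 used in this log:

    // Probe the apiserver port directly; connection refused here matches
    // the kubectl failure recorded above.
    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    func main() {
    	conn, err := net.DialTimeout("tcp", "localhost:8443", 2*time.Second)
    	if err != nil {
    		fmt.Println("apiserver not serving:", err)
    		return
    	}
    	conn.Close()
    	fmt.Println("apiserver port is accepting connections")
    }
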
	I1217 21:20:06.962923  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:06.962951  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:07.034690  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:07.034761  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:07.089456  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:07.089665  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
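
That completes one full diagnostic cycle: for each control-plane component, list matching containers with `crictl ps -a --quiet --name=X`, then tail the logs of each hit with `crictl logs --tail 400`, alongside journalctl, dmesg, and `kubectl describe nodes`. The sketch below shows the crictl discovery/gather pattern in a self-contained form; it is illustrative only (minikube runs these commands over SSH inside the node), and the helper names listContainers and tailLogs are invented for the example. It assumes crictl is on PATH and runnable via sudo.

    // For each component: discover container IDs, then fetch the last
    // 400 log lines of every container found.
    package main

    import (
    	"fmt"
    	"os/exec"
    	"strings"
    )

    // listContainers returns IDs of all containers (any state) whose name
    // matches the component; crictl prints one ID per line.
    func listContainers(name string) ([]string, error) {
    	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
    	if err != nil {
    		return nil, err
    	}
    	return strings.Fields(string(out)), nil
    }

    // tailLogs returns the last n log lines of one container.
    func tailLogs(id string, n int) (string, error) {
    	out, err := exec.Command("sudo", "crictl", "logs", "--tail", fmt.Sprint(n), id).CombinedOutput()
    	return string(out), err
    }

    func main() {
    	components := []string{"kube-apiserver", "etcd", "coredns", "kube-scheduler",
    		"kube-proxy", "kube-controller-manager", "kindnet", "storage-provisioner"}
    	for _, c := range components {
    		ids, err := listContainers(c)
    		if err != nil || len(ids) == 0 {
    			fmt.Printf("no container found matching %q\n", c)
    			continue
    		}
    		for _, id := range ids {
    			logs, _ := tailLogs(id, 400)
    			fmt.Printf("=== %s [%s] ===\n%s", c, id, logs)
    		}
    	}
    }

In the log, only kube-apiserver, etcd, kube-scheduler, and kube-controller-manager yield IDs; coredns, kube-proxy, kindnet, and storage-provisioner never start, consistent with an apiserver that never comes up. The same cycle then repeats below every few seconds until the test times out.
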
	I1217 21:20:09.661733  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:09.674897  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:09.674970  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:09.707178  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:09.707201  588228 cri.go:89] found id: ""
	I1217 21:20:09.707209  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:09.707269  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:09.711121  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:09.711196  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:09.749582  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:09.749603  588228 cri.go:89] found id: ""
	I1217 21:20:09.749612  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:09.749670  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:09.754155  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:09.754227  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:09.797129  588228 cri.go:89] found id: ""
	I1217 21:20:09.797152  588228 logs.go:282] 0 containers: []
	W1217 21:20:09.797162  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:09.797168  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:09.797232  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:09.826741  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:09.826763  588228 cri.go:89] found id: ""
	I1217 21:20:09.826772  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:09.826832  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:09.830988  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:09.831069  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:09.861504  588228 cri.go:89] found id: ""
	I1217 21:20:09.861528  588228 logs.go:282] 0 containers: []
	W1217 21:20:09.861538  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:09.861544  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:09.861603  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:09.886715  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:09.886788  588228 cri.go:89] found id: ""
	I1217 21:20:09.886810  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:09.886878  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:09.890822  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:09.890935  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:09.916573  588228 cri.go:89] found id: ""
	I1217 21:20:09.916606  588228 logs.go:282] 0 containers: []
	W1217 21:20:09.916616  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:09.916622  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:09.916681  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:09.942394  588228 cri.go:89] found id: ""
	I1217 21:20:09.942418  588228 logs.go:282] 0 containers: []
	W1217 21:20:09.942426  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:09.942441  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:09.942453  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:09.986660  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:09.986698  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:10.036485  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:10.036528  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:10.076190  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:10.076224  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:10.117301  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:10.117339  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:10.153641  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:10.153671  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:10.222885  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:10.222924  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:10.240948  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:10.240977  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:10.360038  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:10.360062  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:10.360074  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:12.920381  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:12.955230  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:12.955306  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:13.001276  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:13.001297  588228 cri.go:89] found id: ""
	I1217 21:20:13.001305  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:13.001362  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:13.007116  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:13.007202  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:13.039996  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:13.040072  588228 cri.go:89] found id: ""
	I1217 21:20:13.040094  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:13.040178  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:13.044146  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:13.044221  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:13.118939  588228 cri.go:89] found id: ""
	I1217 21:20:13.118963  588228 logs.go:282] 0 containers: []
	W1217 21:20:13.118972  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:13.118978  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:13.119042  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:13.148397  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:13.148421  588228 cri.go:89] found id: ""
	I1217 21:20:13.148429  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:13.148495  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:13.152554  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:13.152646  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:13.183829  588228 cri.go:89] found id: ""
	I1217 21:20:13.183853  588228 logs.go:282] 0 containers: []
	W1217 21:20:13.183862  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:13.183869  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:13.183930  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:13.220588  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:13.220613  588228 cri.go:89] found id: ""
	I1217 21:20:13.220624  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:13.220689  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:13.224674  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:13.224749  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:13.256466  588228 cri.go:89] found id: ""
	I1217 21:20:13.256562  588228 logs.go:282] 0 containers: []
	W1217 21:20:13.256589  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:13.256607  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:13.256683  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:13.282898  588228 cri.go:89] found id: ""
	I1217 21:20:13.282920  588228 logs.go:282] 0 containers: []
	W1217 21:20:13.282929  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:13.282942  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:13.282955  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:13.341783  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:13.341821  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:13.357099  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:13.357128  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:13.432517  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:13.432539  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:13.432551  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:13.479975  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:13.480009  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:13.521370  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:13.521413  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:13.616410  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:13.616434  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:13.650468  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:13.650499  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:13.701229  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:13.701262  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:16.265405  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:16.275679  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:16.275749  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:16.302098  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:16.302121  588228 cri.go:89] found id: ""
	I1217 21:20:16.302130  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:16.302189  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:16.306250  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:16.306329  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:16.335807  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:16.335831  588228 cri.go:89] found id: ""
	I1217 21:20:16.335839  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:16.335893  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:16.339727  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:16.339802  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:16.365823  588228 cri.go:89] found id: ""
	I1217 21:20:16.365846  588228 logs.go:282] 0 containers: []
	W1217 21:20:16.365854  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:16.365861  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:16.365918  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:16.391741  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:16.391769  588228 cri.go:89] found id: ""
	I1217 21:20:16.391782  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:16.391840  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:16.395609  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:16.395687  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:16.420494  588228 cri.go:89] found id: ""
	I1217 21:20:16.420519  588228 logs.go:282] 0 containers: []
	W1217 21:20:16.420527  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:16.420534  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:16.420594  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:16.447320  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:16.447394  588228 cri.go:89] found id: ""
	I1217 21:20:16.447419  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:16.447489  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:16.451384  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:16.451459  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:16.478793  588228 cri.go:89] found id: ""
	I1217 21:20:16.478819  588228 logs.go:282] 0 containers: []
	W1217 21:20:16.478828  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:16.478835  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:16.478904  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:16.514414  588228 cri.go:89] found id: ""
	I1217 21:20:16.514440  588228 logs.go:282] 0 containers: []
	W1217 21:20:16.514449  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:16.514465  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:16.514478  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:16.546820  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:16.546855  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:16.582772  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:16.582802  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:16.640511  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:16.640553  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:16.702955  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:16.702976  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:16.702990  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:16.738045  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:16.738077  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:16.753529  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:16.753556  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:16.789210  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:16.789241  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:16.834859  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:16.834892  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:19.376563  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:19.388939  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:19.389010  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:19.423176  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:19.423202  588228 cri.go:89] found id: ""
	I1217 21:20:19.423210  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:19.423266  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:19.427862  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:19.427926  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:19.457856  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:19.457883  588228 cri.go:89] found id: ""
	I1217 21:20:19.457891  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:19.457948  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:19.463470  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:19.463549  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:19.511032  588228 cri.go:89] found id: ""
	I1217 21:20:19.511060  588228 logs.go:282] 0 containers: []
	W1217 21:20:19.511069  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:19.511076  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:19.511137  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:19.569449  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:19.569473  588228 cri.go:89] found id: ""
	I1217 21:20:19.569482  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:19.569538  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:19.581768  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:19.581845  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:19.619495  588228 cri.go:89] found id: ""
	I1217 21:20:19.619523  588228 logs.go:282] 0 containers: []
	W1217 21:20:19.619532  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:19.619539  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:19.619596  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:19.662517  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:19.662549  588228 cri.go:89] found id: ""
	I1217 21:20:19.662556  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:19.662619  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:19.667339  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:19.667429  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:19.710974  588228 cri.go:89] found id: ""
	I1217 21:20:19.711014  588228 logs.go:282] 0 containers: []
	W1217 21:20:19.711024  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:19.711031  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:19.711109  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:19.744213  588228 cri.go:89] found id: ""
	I1217 21:20:19.744243  588228 logs.go:282] 0 containers: []
	W1217 21:20:19.744266  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:19.744279  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:19.744292  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:19.790450  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:19.790491  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:19.837374  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:19.837407  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:19.922270  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:19.922307  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:19.969178  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:19.969223  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:20.017494  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:20.017554  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:20.054919  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:20.054984  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:20.074528  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:20.074552  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:20.169444  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:20.169461  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:20.169474  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:22.715092  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:22.725384  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:22.725450  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:22.754235  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:22.754252  588228 cri.go:89] found id: ""
	I1217 21:20:22.754262  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:22.754317  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:22.758531  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:22.758601  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:22.786132  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:22.786151  588228 cri.go:89] found id: ""
	I1217 21:20:22.786159  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:22.786211  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:22.790532  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:22.790653  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:22.822532  588228 cri.go:89] found id: ""
	I1217 21:20:22.822552  588228 logs.go:282] 0 containers: []
	W1217 21:20:22.822560  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:22.822566  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:22.822621  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:22.857972  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:22.857992  588228 cri.go:89] found id: ""
	I1217 21:20:22.858001  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:22.858057  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:22.862231  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:22.862298  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:22.898543  588228 cri.go:89] found id: ""
	I1217 21:20:22.898565  588228 logs.go:282] 0 containers: []
	W1217 21:20:22.898574  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:22.898581  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:22.898639  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:22.935748  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:22.935766  588228 cri.go:89] found id: ""
	I1217 21:20:22.935774  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:22.935832  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:22.940006  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:22.940132  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:22.967888  588228 cri.go:89] found id: ""
	I1217 21:20:22.967965  588228 logs.go:282] 0 containers: []
	W1217 21:20:22.967988  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:22.968008  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:22.968095  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:23.006708  588228 cri.go:89] found id: ""
	I1217 21:20:23.006819  588228 logs.go:282] 0 containers: []
	W1217 21:20:23.006843  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:23.006885  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:23.006913  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:23.025825  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:23.025904  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:23.068993  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:23.069067  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:23.117230  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:23.117307  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:23.168140  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:23.168220  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:23.201711  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:23.201786  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:23.251743  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:23.251819  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:23.367344  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:23.367423  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:23.447335  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:23.447357  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:23.447370  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:26.006412  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:26.019316  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:26.019387  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:26.061065  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:26.061083  588228 cri.go:89] found id: ""
	I1217 21:20:26.061091  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:26.061145  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:26.068175  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:26.068256  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:26.103493  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:26.103511  588228 cri.go:89] found id: ""
	I1217 21:20:26.103519  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:26.103574  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:26.112212  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:26.112405  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:26.155159  588228 cri.go:89] found id: ""
	I1217 21:20:26.155235  588228 logs.go:282] 0 containers: []
	W1217 21:20:26.155261  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:26.155280  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:26.155370  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:26.194794  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:26.194866  588228 cri.go:89] found id: ""
	I1217 21:20:26.194889  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:26.194977  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:26.198953  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:26.199140  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:26.251464  588228 cri.go:89] found id: ""
	I1217 21:20:26.251539  588228 logs.go:282] 0 containers: []
	W1217 21:20:26.251563  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:26.251581  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:26.251667  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:26.321766  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:26.321847  588228 cri.go:89] found id: ""
	I1217 21:20:26.321868  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:26.321953  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:26.326877  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:26.326992  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:26.389569  588228 cri.go:89] found id: ""
	I1217 21:20:26.389651  588228 logs.go:282] 0 containers: []
	W1217 21:20:26.389676  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:26.389694  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:26.389801  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:26.443328  588228 cri.go:89] found id: ""
	I1217 21:20:26.443401  588228 logs.go:282] 0 containers: []
	W1217 21:20:26.443434  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:26.443519  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:26.443547  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:26.532111  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:26.532157  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:26.547396  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:26.547426  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:26.600910  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:26.600944  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:26.643475  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:26.643514  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:26.709302  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:26.709338  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:26.791496  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:26.791518  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:26.791531  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:26.873306  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:26.873344  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:26.918131  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:26.918210  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:29.506633  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:29.520591  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:29.520662  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:29.550632  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:29.550653  588228 cri.go:89] found id: ""
	I1217 21:20:29.550661  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:29.550725  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:29.554587  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:29.554664  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:29.591452  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:29.591526  588228 cri.go:89] found id: ""
	I1217 21:20:29.591547  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:29.591635  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:29.595808  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:29.595922  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:29.630601  588228 cri.go:89] found id: ""
	I1217 21:20:29.630624  588228 logs.go:282] 0 containers: []
	W1217 21:20:29.630633  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:29.630638  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:29.630696  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:29.658486  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:29.658506  588228 cri.go:89] found id: ""
	I1217 21:20:29.658515  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:29.658574  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:29.662424  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:29.662493  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:29.687805  588228 cri.go:89] found id: ""
	I1217 21:20:29.687879  588228 logs.go:282] 0 containers: []
	W1217 21:20:29.687950  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:29.687974  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:29.688067  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:29.717078  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:29.717101  588228 cri.go:89] found id: ""
	I1217 21:20:29.717110  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:29.717164  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:29.720996  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:29.721122  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:29.747976  588228 cri.go:89] found id: ""
	I1217 21:20:29.747999  588228 logs.go:282] 0 containers: []
	W1217 21:20:29.748008  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:29.748050  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:29.748133  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:29.772974  588228 cri.go:89] found id: ""
	I1217 21:20:29.773051  588228 logs.go:282] 0 containers: []
	W1217 21:20:29.773067  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:29.773085  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:29.773097  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:29.804970  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:29.804999  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:29.840273  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:29.840353  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:29.900838  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:29.900877  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:29.917390  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:29.917420  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:29.983536  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:29.983607  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:29.983636  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:30.060062  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:30.060149  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:30.135482  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:30.135520  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:30.176388  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:30.176423  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
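The timestamps show the sweep repeating roughly every three seconds, each pass opening with sudo pgrep -xnf kube-apiserver.*minikube.* to look for a running apiserver process. A sketch of such a poll loop follows; only the pgrep command is taken verbatim from the log, while the loop structure and deadline are assumptions. Note that the real wait also checks apiserver health, which is why the passes here keep repeating even though an apiserver container exists:

    package main

    import (
        "fmt"
        "os/exec"
        "time"
    )

    func main() {
        // Hypothetical overall deadline; the log only shows the
        // per-iteration cadence, not the timeout.
        deadline := time.Now().Add(2 * time.Minute)
        for time.Now().Before(deadline) {
            // pgrep exits 0 when a process matches the full command
            // line (-f) exactly (-x), newest first (-n).
            if exec.Command("sudo", "pgrep", "-xnf", "kube-apiserver.*minikube.*").Run() == nil {
                fmt.Println("kube-apiserver process is up")
                return
            }
            time.Sleep(3 * time.Second) // matches the ~3s gap between passes
        }
        fmt.Println("timed out waiting for kube-apiserver")
    }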
	I1217 21:20:32.707830  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:32.718014  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:32.718087  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:32.742097  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:32.742123  588228 cri.go:89] found id: ""
	I1217 21:20:32.742132  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:32.742191  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:32.746025  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:32.746099  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:32.771728  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:32.771748  588228 cri.go:89] found id: ""
	I1217 21:20:32.771756  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:32.771815  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:32.775518  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:32.775591  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:32.800846  588228 cri.go:89] found id: ""
	I1217 21:20:32.800924  588228 logs.go:282] 0 containers: []
	W1217 21:20:32.800940  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:32.800948  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:32.801011  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:32.826263  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:32.826283  588228 cri.go:89] found id: ""
	I1217 21:20:32.826290  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:32.826351  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:32.830191  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:32.830293  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:32.856408  588228 cri.go:89] found id: ""
	I1217 21:20:32.856437  588228 logs.go:282] 0 containers: []
	W1217 21:20:32.856462  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:32.856469  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:32.856533  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:32.880835  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:32.880858  588228 cri.go:89] found id: ""
	I1217 21:20:32.880866  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:32.880925  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:32.884736  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:32.884854  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:32.911887  588228 cri.go:89] found id: ""
	I1217 21:20:32.911952  588228 logs.go:282] 0 containers: []
	W1217 21:20:32.911968  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:32.911976  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:32.912035  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:32.941911  588228 cri.go:89] found id: ""
	I1217 21:20:32.941988  588228 logs.go:282] 0 containers: []
	W1217 21:20:32.942004  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:32.942019  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:32.942034  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:32.975368  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:32.975402  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:33.012470  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:33.012519  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:33.047788  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:33.047822  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:33.097623  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:33.097658  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:33.161032  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:33.161076  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:33.176596  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:33.176628  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:33.247390  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:33.247413  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:33.247428  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:33.278779  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:33.278817  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
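Each collection command is deliberately bounded: journal units and dmesg are capped at 400 lines, and container logs use crictl logs --tail 400. The "container status" line also shows a fallback chain, sudo `which crictl || echo crictl` ps -a || sudo docker ps -a, so the sweep still produces output if crictl is not at an absolute path or the runtime is Docker. A sketch of running such bounded commands through bash, using command strings and the kube-apiserver container ID taken verbatim from the log (the gather helper is illustrative):

    package main

    import (
        "fmt"
        "os/exec"
    )

    // gather runs one bounded collection command through bash and
    // reports how much output came back, echoing the log's
    // "Gathering logs for" lines.
    func gather(desc, cmd string) {
        fmt.Printf("Gathering logs for %s ...\n", desc)
        out, err := exec.Command("/bin/bash", "-c", cmd).CombinedOutput()
        if err != nil {
            fmt.Printf("  failed: %v\n", err)
            return
        }
        fmt.Printf("  collected %d bytes\n", len(out))
    }

    func main() {
        gather("kubelet", "sudo journalctl -u kubelet -n 400")
        gather("containerd", "sudo journalctl -u containerd -n 400")
        gather("dmesg", "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400")
        gather("kube-apiserver", "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501")
    }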
	I1217 21:20:35.815743  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:35.825889  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:35.825957  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:35.852066  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:35.852084  588228 cri.go:89] found id: ""
	I1217 21:20:35.852092  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:35.852148  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:35.855787  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:35.855861  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:35.893392  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:35.893413  588228 cri.go:89] found id: ""
	I1217 21:20:35.893421  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:35.893475  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:35.897148  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:35.897218  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:35.922690  588228 cri.go:89] found id: ""
	I1217 21:20:35.922724  588228 logs.go:282] 0 containers: []
	W1217 21:20:35.922734  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:35.922740  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:35.922796  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:35.949409  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:35.949431  588228 cri.go:89] found id: ""
	I1217 21:20:35.949439  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:35.949497  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:35.953287  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:35.953408  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:35.977374  588228 cri.go:89] found id: ""
	I1217 21:20:35.977396  588228 logs.go:282] 0 containers: []
	W1217 21:20:35.977404  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:35.977410  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:35.977472  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:36.003880  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:36.003908  588228 cri.go:89] found id: ""
	I1217 21:20:36.003917  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:36.003985  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:36.014710  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:36.014808  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:36.043639  588228 cri.go:89] found id: ""
	I1217 21:20:36.043666  588228 logs.go:282] 0 containers: []
	W1217 21:20:36.043675  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:36.043681  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:36.043770  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:36.070844  588228 cri.go:89] found id: ""
	I1217 21:20:36.070873  588228 logs.go:282] 0 containers: []
	W1217 21:20:36.070885  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:36.070920  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:36.070936  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:36.129027  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:36.129065  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:36.164107  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:36.164141  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:36.200371  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:36.200411  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:36.239108  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:36.239142  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:36.270999  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:36.271036  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:36.300743  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:36.300773  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:36.318948  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:36.318979  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:36.384346  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:36.384370  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:36.384383  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
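Every pass ends the same way: kubectl describe nodes fails because nothing is accepting TCP connections on localhost:8443, even though a kube-apiserver container ID keeps turning up. A connection-refused error happens at the TCP layer, before any TLS or HTTP exchange, so a plain dial is enough to distinguish this state from an apiserver that is up but unhealthy. A minimal probe, assuming the same host and port the log's kubectl is failing against:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Same host:port that kubectl reports as refused in the log.
        conn, err := net.DialTimeout("tcp", "localhost:8443", 2*time.Second)
        if err != nil {
            // "connection refused" here means no listener on the port
            // at all -- the container exists but is not serving.
            fmt.Println("dial failed:", err)
            return
        }
        conn.Close()
        fmt.Println("port 8443 is accepting connections")
    }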
	I1217 21:20:38.920396  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:38.930725  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:38.930796  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:38.957048  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:38.957073  588228 cri.go:89] found id: ""
	I1217 21:20:38.957082  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:38.957141  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:38.960861  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:38.960930  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:38.986764  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:38.986788  588228 cri.go:89] found id: ""
	I1217 21:20:38.986796  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:38.986854  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:38.990541  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:38.990618  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:39.020215  588228 cri.go:89] found id: ""
	I1217 21:20:39.020296  588228 logs.go:282] 0 containers: []
	W1217 21:20:39.020323  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:39.020342  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:39.020425  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:39.046610  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:39.046633  588228 cri.go:89] found id: ""
	I1217 21:20:39.046641  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:39.046702  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:39.050635  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:39.050708  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:39.075047  588228 cri.go:89] found id: ""
	I1217 21:20:39.075070  588228 logs.go:282] 0 containers: []
	W1217 21:20:39.075079  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:39.075085  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:39.075142  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:39.104191  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:39.104213  588228 cri.go:89] found id: ""
	I1217 21:20:39.104220  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:39.104309  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:39.108124  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:39.108195  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:39.132227  588228 cri.go:89] found id: ""
	I1217 21:20:39.132302  588228 logs.go:282] 0 containers: []
	W1217 21:20:39.132312  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:39.132318  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:39.132376  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:39.156986  588228 cri.go:89] found id: ""
	I1217 21:20:39.157008  588228 logs.go:282] 0 containers: []
	W1217 21:20:39.157018  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:39.157032  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:39.157043  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:39.214182  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:39.214215  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:39.256918  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:39.256951  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:39.295864  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:39.295897  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:39.335569  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:39.335594  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:39.350331  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:39.350359  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:39.411098  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:39.411117  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:39.411131  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:39.443290  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:39.443323  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:39.483338  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:39.483369  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
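The shape of the recurring failure report ("Process exited with status 1", an empty stdout block, and the kubectl message under stderr) is what Go's os/exec surfaces for a non-zero exit with separately captured streams. A sketch of that mechanism, reusing the exact kubectl command from the log; the buffer wiring is an assumption about structure, since minikube actually runs this through its SSH runner:

    package main

    import (
        "bytes"
        "errors"
        "fmt"
        "os/exec"
    )

    func main() {
        // Verbatim command from the log's "describe nodes" step.
        cmd := exec.Command("/bin/bash", "-c",
            "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig")
        var stdout, stderr bytes.Buffer
        cmd.Stdout, cmd.Stderr = &stdout, &stderr
        err := cmd.Run()
        var exitErr *exec.ExitError
        if errors.As(err, &exitErr) {
            // A non-zero exit surfaces as *exec.ExitError; the streams
            // are reported separately, matching the log's empty stdout
            // block and the kubectl message under stderr.
            fmt.Printf("Process exited with status %d\n", exitErr.ExitCode())
            fmt.Printf("stdout:\n%s\nstderr:\n%s", stdout.String(), stderr.String())
        }
    }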
	I1217 21:20:42.016640  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:42.028464  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:42.028559  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:42.058646  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:42.058671  588228 cri.go:89] found id: ""
	I1217 21:20:42.058679  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:42.058741  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:42.062948  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:42.063024  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:42.092547  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:42.092571  588228 cri.go:89] found id: ""
	I1217 21:20:42.092581  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:42.092644  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:42.097639  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:42.097732  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:42.137356  588228 cri.go:89] found id: ""
	I1217 21:20:42.137386  588228 logs.go:282] 0 containers: []
	W1217 21:20:42.137396  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:42.137403  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:42.137475  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:42.173707  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:42.173732  588228 cri.go:89] found id: ""
	I1217 21:20:42.173742  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:42.173813  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:42.179068  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:42.179153  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:42.208693  588228 cri.go:89] found id: ""
	I1217 21:20:42.208738  588228 logs.go:282] 0 containers: []
	W1217 21:20:42.208749  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:42.208776  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:42.208868  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:42.252205  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:42.252231  588228 cri.go:89] found id: ""
	I1217 21:20:42.252241  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:42.252323  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:42.257100  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:42.257235  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:42.301639  588228 cri.go:89] found id: ""
	I1217 21:20:42.301744  588228 logs.go:282] 0 containers: []
	W1217 21:20:42.301771  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:42.301795  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:42.301915  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:42.333702  588228 cri.go:89] found id: ""
	I1217 21:20:42.333724  588228 logs.go:282] 0 containers: []
	W1217 21:20:42.333734  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:42.333747  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:42.333758  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:42.363215  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:42.363253  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:42.433198  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:42.433219  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:42.433231  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:42.470626  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:42.470656  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:42.512752  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:42.512780  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:42.571238  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:42.571275  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:42.586442  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:42.586469  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:42.641087  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:42.641118  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:42.676911  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:42.676941  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
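One more detail worth noting: the gathering order shuffles between passes (the 21:20:26 pass starts with kubelet, the 21:20:29 pass with container status, this one with containerd). That pattern is consistent with ranging over a Go map, whose iteration order is intentionally randomized; whether minikube keeps its log gatherers in a map is an assumption here, but the sketch below reproduces the visible behavior:

    package main

    import "fmt"

    func main() {
        // Keys mirror the gatherers named in the log; the container
        // status command is simplified from the log's fallback chain.
        gatherers := map[string]string{
            "kubelet":          "sudo journalctl -u kubelet -n 400",
            "containerd":       "sudo journalctl -u containerd -n 400",
            "dmesg":            "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400",
            "container status": "sudo crictl ps -a || sudo docker ps -a",
        }
        // Run this a few times: the printed order varies between runs
        // because Go deliberately randomizes map iteration order.
        for name := range gatherers {
            fmt.Println("Gathering logs for", name, "...")
        }
    }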
	I1217 21:20:45.225105  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:45.245674  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:45.245779  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:45.290630  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:45.290651  588228 cri.go:89] found id: ""
	I1217 21:20:45.290659  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:45.290720  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:45.295245  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:45.295320  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:45.333385  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:45.333405  588228 cri.go:89] found id: ""
	I1217 21:20:45.333413  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:45.333467  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:45.337236  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:45.337333  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:45.362037  588228 cri.go:89] found id: ""
	I1217 21:20:45.362063  588228 logs.go:282] 0 containers: []
	W1217 21:20:45.362072  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:45.362078  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:45.362135  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:45.387727  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:45.387749  588228 cri.go:89] found id: ""
	I1217 21:20:45.387757  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:45.387836  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:45.391716  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:45.391811  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:45.419200  588228 cri.go:89] found id: ""
	I1217 21:20:45.419224  588228 logs.go:282] 0 containers: []
	W1217 21:20:45.419233  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:45.419239  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:45.419332  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:45.445084  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:45.445107  588228 cri.go:89] found id: ""
	I1217 21:20:45.445117  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:45.445210  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:45.448841  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:45.448952  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:45.472891  588228 cri.go:89] found id: ""
	I1217 21:20:45.472914  588228 logs.go:282] 0 containers: []
	W1217 21:20:45.472949  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:45.472964  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:45.473037  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:45.502506  588228 cri.go:89] found id: ""
	I1217 21:20:45.502539  588228 logs.go:282] 0 containers: []
	W1217 21:20:45.502549  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:45.502567  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:45.502585  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:45.517652  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:45.517682  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:45.587777  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:45.587797  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:45.587810  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:45.623008  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:45.623040  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:45.662199  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:45.662230  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:45.697190  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:45.697222  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:45.727834  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:45.727868  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:45.757485  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:45.757511  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:45.818286  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:45.818321  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:48.352557  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:48.362978  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:48.363050  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:48.405418  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:48.405445  588228 cri.go:89] found id: ""
	I1217 21:20:48.405453  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:48.405508  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:48.409888  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:48.409962  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:48.447603  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:48.447630  588228 cri.go:89] found id: ""
	I1217 21:20:48.447638  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:48.447695  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:48.452120  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:48.452195  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:48.478889  588228 cri.go:89] found id: ""
	I1217 21:20:48.478917  588228 logs.go:282] 0 containers: []
	W1217 21:20:48.478926  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:48.478938  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:48.478998  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:48.514679  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:48.514705  588228 cri.go:89] found id: ""
	I1217 21:20:48.514713  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:48.514776  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:48.519317  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:48.519402  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:48.558518  588228 cri.go:89] found id: ""
	I1217 21:20:48.558545  588228 logs.go:282] 0 containers: []
	W1217 21:20:48.558554  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:48.558561  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:48.558618  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:48.590353  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:48.590379  588228 cri.go:89] found id: ""
	I1217 21:20:48.590388  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:48.590448  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:48.594916  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:48.595014  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:48.626947  588228 cri.go:89] found id: ""
	I1217 21:20:48.626975  588228 logs.go:282] 0 containers: []
	W1217 21:20:48.626993  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:48.627000  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:48.627057  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:48.676579  588228 cri.go:89] found id: ""
	I1217 21:20:48.676606  588228 logs.go:282] 0 containers: []
	W1217 21:20:48.676624  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:48.676638  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:48.676656  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:48.692122  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:48.692151  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:48.773815  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:48.773840  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:48.773852  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:48.809225  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:48.809266  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:48.841702  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:48.841742  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:48.906612  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:48.906655  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:48.957112  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:48.957144  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:49.000183  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:49.000216  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:49.104522  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:49.104564  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:51.660545  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:51.670449  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:51.670517  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:51.695487  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:51.695509  588228 cri.go:89] found id: ""
	I1217 21:20:51.695517  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:51.695571  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:51.698992  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:51.699071  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:51.725301  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:51.725321  588228 cri.go:89] found id: ""
	I1217 21:20:51.725329  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:51.725386  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:51.728954  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:51.729021  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:51.757373  588228 cri.go:89] found id: ""
	I1217 21:20:51.757396  588228 logs.go:282] 0 containers: []
	W1217 21:20:51.757405  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:51.757412  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:51.757468  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:51.781944  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:51.781964  588228 cri.go:89] found id: ""
	I1217 21:20:51.781973  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:51.782032  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:51.785778  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:51.785885  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:51.810654  588228 cri.go:89] found id: ""
	I1217 21:20:51.810677  588228 logs.go:282] 0 containers: []
	W1217 21:20:51.810686  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:51.810693  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:51.810754  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:51.835828  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:51.835849  588228 cri.go:89] found id: ""
	I1217 21:20:51.835857  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:51.835915  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:51.839636  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:51.839708  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:51.864079  588228 cri.go:89] found id: ""
	I1217 21:20:51.864104  588228 logs.go:282] 0 containers: []
	W1217 21:20:51.864113  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:51.864120  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:51.864181  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:51.889010  588228 cri.go:89] found id: ""
	I1217 21:20:51.889033  588228 logs.go:282] 0 containers: []
	W1217 21:20:51.889042  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:51.889056  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:51.889067  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:51.946163  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:51.946200  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:51.962022  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:51.962060  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:52.054974  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:52.055000  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:52.055014  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:52.093132  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:52.093162  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:52.129081  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:52.129113  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:52.165735  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:52.165843  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:52.195441  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:52.195467  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:52.235339  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:52.235368  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
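
The cycle above is minikube's control-plane probe: a pgrep for a running kube-apiserver process, followed by one crictl query per expected component to collect container IDs. A minimal bash sketch of that discovery step, assuming a shell inside the node (e.g. via minikube ssh) with crictl on PATH:

    # Minimal sketch of the discovery step above: ask the CRI runtime for any
    # container (running or exited) whose name matches each expected component.
    # The component list mirrors the log; nothing here comes from minikube source.
    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
                kube-controller-manager kindnet storage-provisioner; do
      ids=$(sudo crictl ps -a --quiet --name="$name")
      if [ -n "$ids" ]; then
        echo "$name: $ids"
      else
        echo "$name: no container found"   # corresponds to the W-level lines above
      fi
    done

In the log only kube-apiserver, etcd, kube-scheduler and kube-controller-manager ever return an ID; kube-proxy, coredns, kindnet and storage-provisioner never start, which is consistent with an apiserver that never comes up.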
	I1217 21:20:54.765924  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:54.776392  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:54.776470  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:54.804217  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:54.804295  588228 cri.go:89] found id: ""
	I1217 21:20:54.804304  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:54.804383  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:54.808819  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:54.808908  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:54.834255  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:54.834280  588228 cri.go:89] found id: ""
	I1217 21:20:54.834289  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:54.834349  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:54.838241  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:54.838319  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:54.867358  588228 cri.go:89] found id: ""
	I1217 21:20:54.867382  588228 logs.go:282] 0 containers: []
	W1217 21:20:54.867391  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:54.867397  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:54.867467  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:54.893428  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:54.893494  588228 cri.go:89] found id: ""
	I1217 21:20:54.893516  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:54.893599  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:54.897304  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:54.897390  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:54.922875  588228 cri.go:89] found id: ""
	I1217 21:20:54.922901  588228 logs.go:282] 0 containers: []
	W1217 21:20:54.922919  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:54.922926  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:54.922993  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:54.949044  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:54.949109  588228 cri.go:89] found id: ""
	I1217 21:20:54.949131  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:54.949225  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:54.953103  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:54.953180  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:54.977615  588228 cri.go:89] found id: ""
	I1217 21:20:54.977650  588228 logs.go:282] 0 containers: []
	W1217 21:20:54.977660  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:54.977666  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:54.977736  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:55.009013  588228 cri.go:89] found id: ""
	I1217 21:20:55.009054  588228 logs.go:282] 0 containers: []
	W1217 21:20:55.009065  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:55.009081  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:55.009094  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:55.027834  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:55.027920  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:55.114878  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:55.114912  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:55.114934  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:55.166784  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:55.166825  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:55.202363  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:55.202434  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:55.245986  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:55.246021  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:55.290996  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:55.291027  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:55.348320  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:55.348358  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:55.392943  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:55.392977  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:20:57.922970  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:20:57.932902  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:20:57.932993  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:20:57.957491  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:57.957513  588228 cri.go:89] found id: ""
	I1217 21:20:57.957522  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:20:57.957578  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:57.961453  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:20:57.961528  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:20:57.989358  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:57.989379  588228 cri.go:89] found id: ""
	I1217 21:20:57.989387  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:20:57.989447  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:57.993509  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:20:57.993577  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:20:58.024522  588228 cri.go:89] found id: ""
	I1217 21:20:58.024549  588228 logs.go:282] 0 containers: []
	W1217 21:20:58.024561  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:20:58.024568  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:20:58.024631  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:20:58.062068  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:58.062094  588228 cri.go:89] found id: ""
	I1217 21:20:58.062104  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:20:58.062161  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:58.069768  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:20:58.069845  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:20:58.095700  588228 cri.go:89] found id: ""
	I1217 21:20:58.095727  588228 logs.go:282] 0 containers: []
	W1217 21:20:58.095737  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:20:58.095744  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:20:58.095813  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:20:58.125695  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:58.125721  588228 cri.go:89] found id: ""
	I1217 21:20:58.125744  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:20:58.125806  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:20:58.129692  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:20:58.129793  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:20:58.155686  588228 cri.go:89] found id: ""
	I1217 21:20:58.155715  588228 logs.go:282] 0 containers: []
	W1217 21:20:58.155724  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:20:58.155730  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:20:58.155791  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:20:58.181151  588228 cri.go:89] found id: ""
	I1217 21:20:58.181176  588228 logs.go:282] 0 containers: []
	W1217 21:20:58.181194  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:20:58.181225  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:20:58.181244  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:20:58.196285  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:20:58.196316  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:20:58.259218  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:20:58.259237  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:20:58.259250  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:20:58.292738  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:20:58.292772  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:20:58.325002  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:20:58.325034  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:20:58.357952  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:20:58.357982  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:20:58.396757  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:20:58.396787  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:20:58.425891  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:20:58.425921  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:20:58.484111  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:20:58.484149  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:01.018807  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:01.030258  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:01.030327  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:01.059089  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:01.059110  588228 cri.go:89] found id: ""
	I1217 21:21:01.059127  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:01.059185  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:01.066087  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:01.066159  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:01.091526  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:01.091545  588228 cri.go:89] found id: ""
	I1217 21:21:01.091553  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:01.091611  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:01.095374  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:01.095454  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:01.125404  588228 cri.go:89] found id: ""
	I1217 21:21:01.125426  588228 logs.go:282] 0 containers: []
	W1217 21:21:01.125435  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:01.125442  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:01.125503  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:01.152007  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:01.152028  588228 cri.go:89] found id: ""
	I1217 21:21:01.152036  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:01.152107  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:01.156437  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:01.156548  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:01.186826  588228 cri.go:89] found id: ""
	I1217 21:21:01.186853  588228 logs.go:282] 0 containers: []
	W1217 21:21:01.186862  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:01.186869  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:01.186934  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:01.213936  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:01.213960  588228 cri.go:89] found id: ""
	I1217 21:21:01.213968  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:01.214042  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:01.218075  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:01.218161  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:01.242974  588228 cri.go:89] found id: ""
	I1217 21:21:01.243000  588228 logs.go:282] 0 containers: []
	W1217 21:21:01.243009  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:01.243016  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:01.243077  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:01.272498  588228 cri.go:89] found id: ""
	I1217 21:21:01.272524  588228 logs.go:282] 0 containers: []
	W1217 21:21:01.272535  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:01.272550  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:01.272562  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:01.311604  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:01.311635  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:01.341950  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:01.341984  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:01.402138  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:01.402171  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:01.417303  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:01.417330  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:01.451350  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:01.451382  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:01.486020  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:01.486050  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:01.515011  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:01.515039  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:01.584789  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:01.584813  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:01.584826  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
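
Every "describe nodes" attempt fails the same way: kubectl dials localhost:8443 and is refused, even though an apiserver container ID was found. A refused connection means no process is accepting connections on the port, which is a different failure from the container being absent. A hedged way to separate the two cases (the /healthz path and the curl flags are assumptions for illustration, not commands from the log):

    # Sketch: distinguish "container exists" from "port is being served".
    # -k is used because the apiserver's serving cert is not trusted here.
    sudo crictl ps -a --name=kube-apiserver
    if curl -sk --max-time 5 https://localhost:8443/healthz; then
      echo "apiserver is accepting connections"
    else
      echo "nothing is serving on localhost:8443 - describe nodes cannot succeed"
    fi

If the probe also fails, the next place to look is the apiserver container log, which the loop already collects below with crictl logs.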
	I1217 21:21:04.124776  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:04.135367  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:04.135499  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:04.161428  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:04.161452  588228 cri.go:89] found id: ""
	I1217 21:21:04.161460  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:04.161542  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:04.165440  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:04.165523  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:04.190398  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:04.190464  588228 cri.go:89] found id: ""
	I1217 21:21:04.190485  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:04.190561  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:04.194383  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:04.194464  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:04.219976  588228 cri.go:89] found id: ""
	I1217 21:21:04.220065  588228 logs.go:282] 0 containers: []
	W1217 21:21:04.220106  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:04.220126  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:04.220212  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:04.246607  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:04.246631  588228 cri.go:89] found id: ""
	I1217 21:21:04.246639  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:04.246727  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:04.250532  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:04.250621  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:04.275992  588228 cri.go:89] found id: ""
	I1217 21:21:04.276019  588228 logs.go:282] 0 containers: []
	W1217 21:21:04.276037  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:04.276060  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:04.276193  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:04.300865  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:04.300887  588228 cri.go:89] found id: ""
	I1217 21:21:04.300897  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:04.300976  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:04.304955  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:04.305029  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:04.330364  588228 cri.go:89] found id: ""
	I1217 21:21:04.330432  588228 logs.go:282] 0 containers: []
	W1217 21:21:04.330449  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:04.330457  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:04.330524  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:04.355595  588228 cri.go:89] found id: ""
	I1217 21:21:04.355620  588228 logs.go:282] 0 containers: []
	W1217 21:21:04.355630  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:04.355652  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:04.355666  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:04.370938  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:04.370967  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:04.404837  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:04.404868  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:04.439118  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:04.439152  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:04.467966  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:04.468007  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:04.500779  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:04.500811  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:04.558549  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:04.558584  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:04.631931  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:04.631952  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:04.631966  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:04.674750  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:04.674781  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:07.215851  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:07.225831  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:07.225898  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:07.250403  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:07.250429  588228 cri.go:89] found id: ""
	I1217 21:21:07.250438  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:07.250494  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:07.254156  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:07.254233  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:07.278715  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:07.278737  588228 cri.go:89] found id: ""
	I1217 21:21:07.278746  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:07.278801  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:07.282502  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:07.282575  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:07.307246  588228 cri.go:89] found id: ""
	I1217 21:21:07.307268  588228 logs.go:282] 0 containers: []
	W1217 21:21:07.307278  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:07.307286  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:07.307344  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:07.336687  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:07.336709  588228 cri.go:89] found id: ""
	I1217 21:21:07.336717  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:07.336802  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:07.340436  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:07.340534  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:07.365140  588228 cri.go:89] found id: ""
	I1217 21:21:07.365210  588228 logs.go:282] 0 containers: []
	W1217 21:21:07.365226  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:07.365234  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:07.365293  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:07.389754  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:07.389777  588228 cri.go:89] found id: ""
	I1217 21:21:07.389785  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:07.389843  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:07.393890  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:07.393965  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:07.418585  588228 cri.go:89] found id: ""
	I1217 21:21:07.418607  588228 logs.go:282] 0 containers: []
	W1217 21:21:07.418616  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:07.418623  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:07.418680  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:07.443702  588228 cri.go:89] found id: ""
	I1217 21:21:07.443727  588228 logs.go:282] 0 containers: []
	W1217 21:21:07.443736  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:07.443749  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:07.443761  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:07.500618  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:07.500655  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:07.581147  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:07.581178  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:07.581192  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:07.618984  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:07.619013  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:07.653021  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:07.653051  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:07.686946  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:07.686975  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:07.702130  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:07.702158  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:07.741945  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:07.741980  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:07.775444  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:07.775485  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
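
The timestamps show the whole sequence repeating roughly every three seconds (21:20:54, :57, 21:21:01, :04, :07, ...), i.e. a poll-until-deadline loop around the pgrep probe. A sketch of that loop follows; the 3s interval and the 120s deadline are assumptions inferred from the log cadence, not constants taken from minikube's source:

    # Poll for a kube-apiserver process until it appears or the deadline passes,
    # using the same pgrep invocation as the log above.
    deadline=$(( $(date +%s) + 120 ))
    until sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; do
      if [ "$(date +%s)" -ge "$deadline" ]; then
        echo "timed out waiting for a kube-apiserver process" >&2
        exit 1
      fi
      sleep 3
    done
    echo "kube-apiserver process found"

Here the pgrep never succeeds within the test's budget, so the loop keeps re-gathering the same diagnostics until the test times out.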
	I1217 21:21:10.325275  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:10.335479  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:10.335548  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:10.361884  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:10.361962  588228 cri.go:89] found id: ""
	I1217 21:21:10.361985  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:10.362074  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:10.365853  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:10.366014  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:10.398513  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:10.398538  588228 cri.go:89] found id: ""
	I1217 21:21:10.398547  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:10.398604  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:10.403055  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:10.403180  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:10.428480  588228 cri.go:89] found id: ""
	I1217 21:21:10.428504  588228 logs.go:282] 0 containers: []
	W1217 21:21:10.428513  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:10.428520  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:10.428580  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:10.454054  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:10.454074  588228 cri.go:89] found id: ""
	I1217 21:21:10.454083  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:10.454141  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:10.458029  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:10.458130  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:10.483396  588228 cri.go:89] found id: ""
	I1217 21:21:10.483424  588228 logs.go:282] 0 containers: []
	W1217 21:21:10.483433  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:10.483440  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:10.483507  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:10.509787  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:10.509814  588228 cri.go:89] found id: ""
	I1217 21:21:10.509828  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:10.509886  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:10.513894  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:10.513979  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:10.545434  588228 cri.go:89] found id: ""
	I1217 21:21:10.545452  588228 logs.go:282] 0 containers: []
	W1217 21:21:10.545460  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:10.545465  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:10.545522  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:10.578361  588228 cri.go:89] found id: ""
	I1217 21:21:10.578389  588228 logs.go:282] 0 containers: []
	W1217 21:21:10.578406  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:10.578422  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:10.578440  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:10.637181  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:10.637217  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:10.709146  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:10.709167  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:10.709181  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:10.744167  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:10.744201  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:10.778712  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:10.778746  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:10.810382  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:10.810418  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:10.826006  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:10.826034  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:10.865719  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:10.865751  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:10.900931  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:10.900964  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:13.432043  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:13.442484  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:13.442556  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:13.471064  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:13.471085  588228 cri.go:89] found id: ""
	I1217 21:21:13.471093  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:13.471148  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:13.474744  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:13.474812  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:13.500146  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:13.500165  588228 cri.go:89] found id: ""
	I1217 21:21:13.500174  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:13.500229  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:13.504093  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:13.504168  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:13.532980  588228 cri.go:89] found id: ""
	I1217 21:21:13.533008  588228 logs.go:282] 0 containers: []
	W1217 21:21:13.533018  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:13.533024  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:13.533091  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:13.559031  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:13.559062  588228 cri.go:89] found id: ""
	I1217 21:21:13.559070  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:13.559124  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:13.562722  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:13.562792  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:13.593099  588228 cri.go:89] found id: ""
	I1217 21:21:13.593122  588228 logs.go:282] 0 containers: []
	W1217 21:21:13.593130  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:13.593137  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:13.593199  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:13.618418  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:13.618437  588228 cri.go:89] found id: ""
	I1217 21:21:13.618446  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:13.618506  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:13.622218  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:13.622287  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:13.647531  588228 cri.go:89] found id: ""
	I1217 21:21:13.647555  588228 logs.go:282] 0 containers: []
	W1217 21:21:13.647564  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:13.647571  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:13.647659  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:13.672171  588228 cri.go:89] found id: ""
	I1217 21:21:13.672195  588228 logs.go:282] 0 containers: []
	W1217 21:21:13.672204  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:13.672270  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:13.672289  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:13.701228  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:13.701266  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:13.784513  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:13.784540  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:13.784553  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:13.817748  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:13.817779  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:13.876749  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:13.876782  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:13.891852  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:13.891879  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:13.926430  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:13.926461  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:13.960041  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:13.960069  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:13.996312  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:13.996350  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:16.534767  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:16.546635  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:16.546702  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:16.589372  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:16.589394  588228 cri.go:89] found id: ""
	I1217 21:21:16.589402  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:16.589461  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:16.593273  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:16.593350  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:16.619765  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:16.619788  588228 cri.go:89] found id: ""
	I1217 21:21:16.619796  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:16.619853  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:16.623883  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:16.623959  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:16.651134  588228 cri.go:89] found id: ""
	I1217 21:21:16.651156  588228 logs.go:282] 0 containers: []
	W1217 21:21:16.651165  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:16.651173  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:16.651233  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:16.689280  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:16.689301  588228 cri.go:89] found id: ""
	I1217 21:21:16.689310  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:16.689364  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:16.693242  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:16.693354  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:16.721501  588228 cri.go:89] found id: ""
	I1217 21:21:16.721524  588228 logs.go:282] 0 containers: []
	W1217 21:21:16.721532  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:16.721538  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:16.721603  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:16.752365  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:16.752440  588228 cri.go:89] found id: ""
	I1217 21:21:16.752462  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:16.752572  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:16.759341  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:16.759466  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:16.804431  588228 cri.go:89] found id: ""
	I1217 21:21:16.804454  588228 logs.go:282] 0 containers: []
	W1217 21:21:16.804464  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:16.804471  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:16.804530  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:16.830093  588228 cri.go:89] found id: ""
	I1217 21:21:16.830126  588228 logs.go:282] 0 containers: []
	W1217 21:21:16.830139  588228 logs.go:284] No container was found matching "storage-provisioner"
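The block above shows minikube probing each expected control-plane container by name: kube-apiserver, etcd, kube-scheduler, and kube-controller-manager are found, while coredns, kube-proxy, kindnet, and storage-provisioner all come back empty, which is consistent with a control plane that never finished coming up. A minimal Go sketch of the same enumeration, assuming crictl is on PATH and the caller can sudo; the helper name and component list are illustrative, not minikube's actual code:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// listContainerIDs mirrors the log's `sudo crictl ps -a --quiet --name=<name>`:
// with --quiet, crictl prints one container ID per line, or nothing at all.
func listContainerIDs(name string) ([]string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
	if err != nil {
		return nil, err
	}
	return strings.Fields(string(out)), nil // empty output yields an empty slice
}

func main() {
	components := []string{"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet", "storage-provisioner"}
	for _, c := range components {
		ids, err := listContainerIDs(c)
		if err != nil || len(ids) == 0 {
			fmt.Printf("no container found matching %q\n", c)
			continue
		}
		fmt.Printf("%s: %v\n", c, ids)
	}
}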
	I1217 21:21:16.830162  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:16.830175  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:16.861146  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:16.861239  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:16.925617  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
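The recurring "describe nodes" failure above is the key symptom: "The connection to the server localhost:8443 was refused" means nothing is accepting TCP connections on the apiserver port inside the node, even though an apiserver container ID exists, so the process is likely crash-looping or not yet listening. A raw TCP probe separates "no listener" from "listener present but unhealthy"; a minimal sketch, assuming it runs on the node itself:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// A bare TCP dial distinguishes "connection refused" (no listener on
	// localhost:8443) from a hang or an HTTP-level error (listener present).
	conn, err := net.DialTimeout("tcp", "localhost:8443", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver port not accepting connections:", err)
		return
	}
	conn.Close()
	fmt.Println("something is listening on localhost:8443")
}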
	I1217 21:21:16.925637  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:16.925650  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:16.964570  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:16.964602  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:17.001156  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:17.001186  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:17.054669  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:17.054702  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:17.113768  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:17.113801  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:17.129734  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:17.129761  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:17.166531  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:17.166568  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
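From here the same sequence repeats on a roughly three-second cadence (21:21:16, :19, :22, :25, ...): pgrep for the apiserver process, re-enumerate containers, re-gather logs. That is the signature of a poll-until-healthy loop that keeps failing its probe. A minimal sketch of the pattern, reusing a TCP check like the one above; the function names and the 3s/1m values are illustrative, not minikube's actual timings:

package main

import (
	"errors"
	"fmt"
	"net"
	"time"
)

func checkAPIServer() error {
	conn, err := net.DialTimeout("tcp", "localhost:8443", 2*time.Second)
	if err != nil {
		return err
	}
	return conn.Close()
}

// pollUntilHealthy retries the probe every interval until it succeeds or the
// deadline passes, matching the retry cadence visible in the log timestamps.
func pollUntilHealthy(interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		if err := checkAPIServer(); err == nil {
			return nil
		}
		time.Sleep(interval)
	}
	return errors.New("apiserver did not become healthy before the deadline")
}

func main() {
	if err := pollUntilHealthy(3*time.Second, 1*time.Minute); err != nil {
		fmt.Println(err)
	}
}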
	I1217 21:21:19.698240  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:19.708238  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:19.708342  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:19.735941  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:19.736006  588228 cri.go:89] found id: ""
	I1217 21:21:19.736028  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:19.736098  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:19.740854  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:19.740973  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:19.778753  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:19.778815  588228 cri.go:89] found id: ""
	I1217 21:21:19.778837  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:19.778905  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:19.782905  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:19.783014  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:19.815940  588228 cri.go:89] found id: ""
	I1217 21:21:19.816008  588228 logs.go:282] 0 containers: []
	W1217 21:21:19.816032  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:19.816053  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:19.816120  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:19.841599  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:19.841623  588228 cri.go:89] found id: ""
	I1217 21:21:19.841630  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:19.841715  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:19.845595  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:19.845673  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:19.870931  588228 cri.go:89] found id: ""
	I1217 21:21:19.871008  588228 logs.go:282] 0 containers: []
	W1217 21:21:19.871035  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:19.871053  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:19.871153  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:19.896571  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:19.896592  588228 cri.go:89] found id: ""
	I1217 21:21:19.896601  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:19.896655  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:19.900171  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:19.900289  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:19.925592  588228 cri.go:89] found id: ""
	I1217 21:21:19.925615  588228 logs.go:282] 0 containers: []
	W1217 21:21:19.925624  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:19.925630  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:19.925688  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:19.955067  588228 cri.go:89] found id: ""
	I1217 21:21:19.955093  588228 logs.go:282] 0 containers: []
	W1217 21:21:19.955104  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:19.955118  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:19.955135  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:19.970245  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:19.970272  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:20.041075  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:20.041140  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:20.041159  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:20.088774  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:20.088809  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:20.121189  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:20.121216  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:20.179347  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:20.179385  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:20.214039  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:20.214071  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:20.248469  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:20.248502  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:20.284695  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:20.284727  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:22.813507  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:22.823346  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:22.823416  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:22.849201  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:22.849221  588228 cri.go:89] found id: ""
	I1217 21:21:22.849229  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:22.849284  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:22.852909  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:22.852974  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:22.878508  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:22.878530  588228 cri.go:89] found id: ""
	I1217 21:21:22.878538  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:22.878593  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:22.882234  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:22.882342  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:22.907670  588228 cri.go:89] found id: ""
	I1217 21:21:22.907691  588228 logs.go:282] 0 containers: []
	W1217 21:21:22.907699  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:22.907706  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:22.907764  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:22.932828  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:22.932848  588228 cri.go:89] found id: ""
	I1217 21:21:22.932856  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:22.932909  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:22.936351  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:22.936479  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:22.959870  588228 cri.go:89] found id: ""
	I1217 21:21:22.959938  588228 logs.go:282] 0 containers: []
	W1217 21:21:22.959962  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:22.959980  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:22.960060  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:22.985642  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:22.985664  588228 cri.go:89] found id: ""
	I1217 21:21:22.985672  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:22.985747  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:22.989337  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:22.989404  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:23.022404  588228 cri.go:89] found id: ""
	I1217 21:21:23.022427  588228 logs.go:282] 0 containers: []
	W1217 21:21:23.022436  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:23.022448  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:23.022510  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:23.048553  588228 cri.go:89] found id: ""
	I1217 21:21:23.048576  588228 logs.go:282] 0 containers: []
	W1217 21:21:23.048585  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:23.048599  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:23.048611  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:23.113751  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:23.113772  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:23.113785  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:23.149473  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:23.149505  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:23.186043  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:23.186075  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:23.219536  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:23.219566  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:23.256667  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:23.256699  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:23.285629  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:23.285655  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:23.343521  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:23.343578  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:23.358740  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:23.358769  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
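Each failed probe triggers the same evidence bundle: the kubelet and containerd units via `journalctl -u <unit> -n 400`, a filtered `dmesg`, the last 400 lines of each control-plane container via `crictl logs --tail 400 <id>`, and a container status listing. A sketch that captures the same bundle to files for offline triage, assuming the same tools are present; the directory layout and helper name are illustrative:

package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
)

// capture runs a shell command and writes its combined output to <name>.log,
// mirroring the log-gathering commands seen in this report.
func capture(dir, name, cmd string) {
	out, _ := exec.Command("/bin/bash", "-c", cmd).CombinedOutput() // keep partial output even on failure
	_ = os.WriteFile(filepath.Join(dir, name+".log"), out, 0o644)
}

func main() {
	dir, err := os.MkdirTemp("", "gather-")
	if err != nil {
		panic(err)
	}
	capture(dir, "kubelet", "sudo journalctl -u kubelet -n 400")
	capture(dir, "containerd", "sudo journalctl -u containerd -n 400")
	capture(dir, "dmesg", "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400")
	capture(dir, "containers", "sudo crictl ps -a")
	fmt.Println("evidence written to", dir)
}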
	I1217 21:21:25.888785  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:25.901103  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:25.901171  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:25.932371  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:25.932390  588228 cri.go:89] found id: ""
	I1217 21:21:25.932398  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:25.932455  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:25.936650  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:25.936723  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:25.972034  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:25.972115  588228 cri.go:89] found id: ""
	I1217 21:21:25.972137  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:25.972229  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:25.976649  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:25.976722  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:26.018763  588228 cri.go:89] found id: ""
	I1217 21:21:26.018787  588228 logs.go:282] 0 containers: []
	W1217 21:21:26.018796  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:26.018803  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:26.018863  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:26.051778  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:26.051799  588228 cri.go:89] found id: ""
	I1217 21:21:26.051807  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:26.051863  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:26.056113  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:26.056182  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:26.086376  588228 cri.go:89] found id: ""
	I1217 21:21:26.086457  588228 logs.go:282] 0 containers: []
	W1217 21:21:26.086481  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:26.086499  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:26.086606  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:26.123892  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:26.123914  588228 cri.go:89] found id: ""
	I1217 21:21:26.123923  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:26.123990  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:26.138540  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:26.138656  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:26.170727  588228 cri.go:89] found id: ""
	I1217 21:21:26.170806  588228 logs.go:282] 0 containers: []
	W1217 21:21:26.170835  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:26.170854  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:26.170963  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:26.206226  588228 cri.go:89] found id: ""
	I1217 21:21:26.206314  588228 logs.go:282] 0 containers: []
	W1217 21:21:26.206338  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:26.206379  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:26.206407  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:26.283939  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:26.284048  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:26.373412  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:26.373432  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:26.373447  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:26.422105  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:26.422180  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:26.473541  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:26.473623  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:26.575598  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:26.575679  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:26.626306  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:26.626401  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:26.646662  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:26.646739  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:26.713278  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:26.713366  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:29.267793  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:29.278236  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:29.278306  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:29.303263  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:29.303285  588228 cri.go:89] found id: ""
	I1217 21:21:29.303294  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:29.303361  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:29.307325  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:29.307412  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:29.332628  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:29.332651  588228 cri.go:89] found id: ""
	I1217 21:21:29.332660  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:29.332717  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:29.336496  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:29.336571  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:29.362066  588228 cri.go:89] found id: ""
	I1217 21:21:29.362089  588228 logs.go:282] 0 containers: []
	W1217 21:21:29.362099  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:29.362105  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:29.362166  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:29.388207  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:29.388230  588228 cri.go:89] found id: ""
	I1217 21:21:29.388238  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:29.388336  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:29.392037  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:29.392146  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:29.417094  588228 cri.go:89] found id: ""
	I1217 21:21:29.417119  588228 logs.go:282] 0 containers: []
	W1217 21:21:29.417128  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:29.417134  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:29.417193  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:29.443762  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:29.443786  588228 cri.go:89] found id: ""
	I1217 21:21:29.443800  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:29.443857  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:29.447693  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:29.447792  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:29.473566  588228 cri.go:89] found id: ""
	I1217 21:21:29.473601  588228 logs.go:282] 0 containers: []
	W1217 21:21:29.473611  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:29.473618  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:29.473728  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:29.516018  588228 cri.go:89] found id: ""
	I1217 21:21:29.516044  588228 logs.go:282] 0 containers: []
	W1217 21:21:29.516052  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:29.516066  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:29.516079  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:29.549695  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:29.549728  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:29.614500  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:29.614576  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:29.686182  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:29.686204  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:29.686217  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:29.721770  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:29.721803  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:29.756934  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:29.756967  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:29.792833  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:29.792868  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:29.822699  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:29.822733  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:29.883215  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:29.883250  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
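The container-status command in these cycles uses a double fallback: `sudo `which crictl || echo crictl` ps -a || sudo docker ps -a` resolves crictl's path (falling back to the bare name), and if the crictl invocation fails entirely it falls back to `docker ps -a`, so the gather works on both containerd and docker runtimes. A minimal Go rendering of the same try-then-fall-back logic; illustrative only, not minikube's code:

package main

import (
	"fmt"
	"os/exec"
)

// containerStatus tries the CRI tool first and falls back to docker,
// mirroring `sudo crictl ps -a || sudo docker ps -a` from the log.
func containerStatus() (string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a").Output()
	if err == nil {
		return string(out), nil
	}
	out, err = exec.Command("sudo", "docker", "ps", "-a").Output()
	return string(out), err
}

func main() {
	status, err := containerStatus()
	if err != nil {
		fmt.Println("neither crictl nor docker produced a listing:", err)
		return
	}
	fmt.Print(status)
}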
	I1217 21:21:32.400628  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:32.410501  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:32.410573  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:32.435983  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:32.436005  588228 cri.go:89] found id: ""
	I1217 21:21:32.436013  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:32.436071  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:32.439801  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:32.439871  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:32.465477  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:32.465497  588228 cri.go:89] found id: ""
	I1217 21:21:32.465505  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:32.465561  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:32.469447  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:32.469517  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:32.505329  588228 cri.go:89] found id: ""
	I1217 21:21:32.505351  588228 logs.go:282] 0 containers: []
	W1217 21:21:32.505360  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:32.505367  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:32.505425  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:32.533906  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:32.533926  588228 cri.go:89] found id: ""
	I1217 21:21:32.533933  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:32.533988  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:32.538199  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:32.538271  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:32.577612  588228 cri.go:89] found id: ""
	I1217 21:21:32.577640  588228 logs.go:282] 0 containers: []
	W1217 21:21:32.577656  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:32.577663  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:32.577723  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:32.607492  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:32.607515  588228 cri.go:89] found id: ""
	I1217 21:21:32.607524  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:32.607613  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:32.611412  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:32.611486  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:32.640171  588228 cri.go:89] found id: ""
	I1217 21:21:32.640200  588228 logs.go:282] 0 containers: []
	W1217 21:21:32.640209  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:32.640215  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:32.640345  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:32.664885  588228 cri.go:89] found id: ""
	I1217 21:21:32.664909  588228 logs.go:282] 0 containers: []
	W1217 21:21:32.664917  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:32.664931  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:32.664944  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:32.699532  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:32.699563  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:32.727918  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:32.727949  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:32.789703  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:32.789741  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:32.854783  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:32.854855  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:32.854875  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:32.897417  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:32.897443  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:32.912657  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:32.912698  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:32.959198  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:32.959230  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:32.997784  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:32.997817  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:35.532380  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:35.544293  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:35.544389  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:35.577859  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:35.577894  588228 cri.go:89] found id: ""
	I1217 21:21:35.577903  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:35.577968  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:35.581776  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:35.581894  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:35.607739  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:35.607775  588228 cri.go:89] found id: ""
	I1217 21:21:35.607783  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:35.607849  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:35.611736  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:35.611849  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:35.637757  588228 cri.go:89] found id: ""
	I1217 21:21:35.637823  588228 logs.go:282] 0 containers: []
	W1217 21:21:35.637844  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:35.637875  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:35.637951  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:35.663957  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:35.663977  588228 cri.go:89] found id: ""
	I1217 21:21:35.663985  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:35.664042  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:35.667692  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:35.667776  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:35.691529  588228 cri.go:89] found id: ""
	I1217 21:21:35.691552  588228 logs.go:282] 0 containers: []
	W1217 21:21:35.691561  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:35.691567  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:35.691623  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:35.717757  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:35.717791  588228 cri.go:89] found id: ""
	I1217 21:21:35.717801  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:35.717868  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:35.721908  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:35.722022  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:35.746146  588228 cri.go:89] found id: ""
	I1217 21:21:35.746213  588228 logs.go:282] 0 containers: []
	W1217 21:21:35.746237  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:35.746255  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:35.746343  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:35.771615  588228 cri.go:89] found id: ""
	I1217 21:21:35.771641  588228 logs.go:282] 0 containers: []
	W1217 21:21:35.771650  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:35.771663  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:35.771676  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:35.829386  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:35.829420  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:35.874889  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:35.874921  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:35.916598  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:35.916632  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:35.953012  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:35.953043  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:35.988662  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:35.988692  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:36.019334  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:36.019372  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:36.034951  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:36.034979  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:36.100740  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:36.100813  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:36.100842  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:38.640058  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:38.650349  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:38.650421  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:38.679322  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:38.679343  588228 cri.go:89] found id: ""
	I1217 21:21:38.679351  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:38.679409  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:38.683106  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:38.683182  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:38.709038  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:38.709062  588228 cri.go:89] found id: ""
	I1217 21:21:38.709072  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:38.709128  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:38.713028  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:38.713102  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:38.738238  588228 cri.go:89] found id: ""
	I1217 21:21:38.738260  588228 logs.go:282] 0 containers: []
	W1217 21:21:38.738269  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:38.738275  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:38.738337  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:38.767722  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:38.767796  588228 cri.go:89] found id: ""
	I1217 21:21:38.767821  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:38.767894  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:38.771763  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:38.771880  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:38.797483  588228 cri.go:89] found id: ""
	I1217 21:21:38.797505  588228 logs.go:282] 0 containers: []
	W1217 21:21:38.797515  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:38.797521  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:38.797583  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:38.823214  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:38.823234  588228 cri.go:89] found id: ""
	I1217 21:21:38.823242  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:38.823296  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:38.826931  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:38.826999  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:38.852034  588228 cri.go:89] found id: ""
	I1217 21:21:38.852056  588228 logs.go:282] 0 containers: []
	W1217 21:21:38.852065  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:38.852072  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:38.852131  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:38.877345  588228 cri.go:89] found id: ""
	I1217 21:21:38.877367  588228 logs.go:282] 0 containers: []
	W1217 21:21:38.877376  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:38.877392  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:38.877404  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:38.934792  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:38.934827  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:38.996933  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:38.996953  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:38.996967  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:39.031246  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:39.031277  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:39.063131  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:39.063165  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:39.104865  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:39.104897  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:39.143126  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:39.143161  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:39.158746  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:39.158773  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:39.190796  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:39.190825  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:41.720039  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:41.729919  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:41.729989  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:41.755152  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:41.755174  588228 cri.go:89] found id: ""
	I1217 21:21:41.755182  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:41.755246  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:41.759011  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:41.759084  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:41.785116  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:41.785139  588228 cri.go:89] found id: ""
	I1217 21:21:41.785148  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:41.785204  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:41.788970  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:41.789043  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:41.816031  588228 cri.go:89] found id: ""
	I1217 21:21:41.816054  588228 logs.go:282] 0 containers: []
	W1217 21:21:41.816063  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:41.816070  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:41.816151  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:41.845859  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:41.845881  588228 cri.go:89] found id: ""
	I1217 21:21:41.845889  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:41.845949  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:41.849869  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:41.849944  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:41.875379  588228 cri.go:89] found id: ""
	I1217 21:21:41.875403  588228 logs.go:282] 0 containers: []
	W1217 21:21:41.875422  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:41.875428  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:41.875497  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:41.900359  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:41.900390  588228 cri.go:89] found id: ""
	I1217 21:21:41.900398  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:41.900462  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:41.904181  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:41.904314  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:41.927792  588228 cri.go:89] found id: ""
	I1217 21:21:41.927820  588228 logs.go:282] 0 containers: []
	W1217 21:21:41.927829  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:41.927837  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:41.927897  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:41.953750  588228 cri.go:89] found id: ""
	I1217 21:21:41.953772  588228 logs.go:282] 0 containers: []
	W1217 21:21:41.953780  588228 logs.go:284] No container was found matching "storage-provisioner"
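	Each pass enumerates control-plane containers with the same crictl invocation per component: sudo crictl ps -a --quiet --name=<component>, which prints one container ID per line and nothing at all when there is no match (logged above as No container was found matching ...; at this point only kube-apiserver, etcd, kube-scheduler, and kube-controller-manager exist). A sketch of that enumeration in Go, assuming crictl is installed and the caller is allowed to talk to the CRI socket (the real commands run under sudo):

	package main

	import (
		"fmt"
		"os/exec"
		"strings"
	)

	// containerIDs mirrors: crictl ps -a --quiet --name=<component>
	// The --quiet output is one container ID per line, possibly empty.
	func containerIDs(component string) ([]string, error) {
		out, err := exec.Command("crictl", "ps", "-a", "--quiet", "--name="+component).Output()
		if err != nil {
			return nil, err
		}
		return strings.Fields(string(out)), nil // Fields drops blank lines
	}

	func main() {
		components := []string{"kube-apiserver", "etcd", "coredns", "kube-scheduler",
			"kube-proxy", "kube-controller-manager", "kindnet", "storage-provisioner"}
		for _, c := range components {
			ids, err := containerIDs(c)
			if err != nil {
				fmt.Printf("%s: crictl failed: %v\n", c, err)
				continue
			}
			if len(ids) == 0 {
				fmt.Printf("no container found matching %q\n", c)
				continue
			}
			fmt.Printf("%s: %v\n", c, ids)
		}
	}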
	I1217 21:21:41.953794  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:41.953825  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:42.029974  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:42.030019  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:42.047928  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:42.047957  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:42.084064  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:42.084105  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:42.121714  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:42.121759  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:42.164774  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:42.166358  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:42.228393  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:42.228426  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:42.321478  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
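	The describe-nodes step fails identically on every pass: kubectl cannot reach the API server at localhost:8443, and "connection refused" specifically means nothing is accepting TCP connections on that port, even though a kube-apiserver container ID was found above. A plain TCP dial is enough to distinguish an immediate refusal from a firewall drop or a dead host, which would run into the timeout instead; a small sketch using the address from the log:

	package main

	import (
		"fmt"
		"net"
		"time"
	)

	func main() {
		// A refused connection fails immediately; a dropped packet or dead
		// host would instead exhaust the two-second timeout.
		conn, err := net.DialTimeout("tcp", "localhost:8443", 2*time.Second)
		if err != nil {
			fmt.Println("dial failed:", err) // e.g. "connect: connection refused"
			return
		}
		conn.Close()
		fmt.Println("something is listening on localhost:8443")
	}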
	I1217 21:21:42.321560  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:42.321581  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:42.370468  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:42.370498  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:44.901073  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:44.911165  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:44.911236  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:44.940319  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:44.940347  588228 cri.go:89] found id: ""
	I1217 21:21:44.940355  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:44.940410  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:44.944109  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:44.944182  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:44.968684  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:44.968708  588228 cri.go:89] found id: ""
	I1217 21:21:44.968715  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:44.968771  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:44.972510  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:44.972585  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:44.997840  588228 cri.go:89] found id: ""
	I1217 21:21:44.997907  588228 logs.go:282] 0 containers: []
	W1217 21:21:44.997924  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:44.997932  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:44.997991  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:45.075162  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:45.075256  588228 cri.go:89] found id: ""
	I1217 21:21:45.075282  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:45.075370  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:45.081026  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:45.081107  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:45.119517  588228 cri.go:89] found id: ""
	I1217 21:21:45.119617  588228 logs.go:282] 0 containers: []
	W1217 21:21:45.119634  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:45.119644  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:45.119764  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:45.164101  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:45.164192  588228 cri.go:89] found id: ""
	I1217 21:21:45.164212  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:45.164332  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:45.170126  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:45.170214  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:45.208766  588228 cri.go:89] found id: ""
	I1217 21:21:45.208832  588228 logs.go:282] 0 containers: []
	W1217 21:21:45.208871  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:45.208880  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:45.209000  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:45.281628  588228 cri.go:89] found id: ""
	I1217 21:21:45.281707  588228 logs.go:282] 0 containers: []
	W1217 21:21:45.281731  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:45.281760  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:45.281806  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:45.301394  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:45.301472  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:45.353758  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:45.353793  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:45.383313  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:45.383346  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:45.415986  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:45.416014  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:45.481659  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:45.481697  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:45.548115  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:45.548138  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:45.548156  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:45.592575  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:45.592615  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:45.629380  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:45.629413  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
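	Alongside the per-container logs, every pass tails the host-side sources: journalctl -u kubelet -n 400, journalctl -u containerd -n 400, and dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400 (with util-linux dmesg, -H is human-readable output, -P disables the pager, -L=never disables color, and --level restricts severity to warning and worse). A sketch of the journal half in Go, assuming a systemd host and journal read access:

	package main

	import (
		"fmt"
		"os/exec"
	)

	// tailUnit mirrors: journalctl -u <unit> -n <n>
	func tailUnit(unit string, n int) (string, error) {
		out, err := exec.Command("journalctl", "-u", unit, "-n", fmt.Sprint(n)).CombinedOutput()
		return string(out), err
	}

	func main() {
		for _, unit := range []string{"kubelet", "containerd"} {
			out, err := tailUnit(unit, 400)
			if err != nil {
				fmt.Printf("%s: %v\n", unit, err)
				continue
			}
			fmt.Printf("=== last 400 lines of %s ===\n%s", unit, out)
		}
	}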
	I1217 21:21:48.170919  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:48.180692  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:48.180765  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:48.205097  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:48.205119  588228 cri.go:89] found id: ""
	I1217 21:21:48.205128  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:48.205186  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:48.208582  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:48.208650  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:48.232523  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:48.232547  588228 cri.go:89] found id: ""
	I1217 21:21:48.232555  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:48.232609  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:48.236396  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:48.236464  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:48.271972  588228 cri.go:89] found id: ""
	I1217 21:21:48.271998  588228 logs.go:282] 0 containers: []
	W1217 21:21:48.272007  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:48.272013  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:48.272071  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:48.305582  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:48.305604  588228 cri.go:89] found id: ""
	I1217 21:21:48.305612  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:48.305667  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:48.309248  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:48.309318  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:48.333168  588228 cri.go:89] found id: ""
	I1217 21:21:48.333191  588228 logs.go:282] 0 containers: []
	W1217 21:21:48.333199  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:48.333206  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:48.333266  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:48.362460  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:48.362481  588228 cri.go:89] found id: ""
	I1217 21:21:48.362489  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:48.362545  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:48.366342  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:48.366418  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:48.391454  588228 cri.go:89] found id: ""
	I1217 21:21:48.391477  588228 logs.go:282] 0 containers: []
	W1217 21:21:48.391486  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:48.391492  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:48.391550  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:48.415073  588228 cri.go:89] found id: ""
	I1217 21:21:48.415135  588228 logs.go:282] 0 containers: []
	W1217 21:21:48.415151  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:48.415168  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:48.415180  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:48.483608  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:48.483683  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:48.483700  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:48.518187  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:48.518219  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:48.554180  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:48.554249  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:48.588135  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:48.588166  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:48.629826  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:48.629896  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:48.690296  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:48.690330  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:48.705381  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:48.705409  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:48.744757  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:48.744787  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
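	The "container status" step relies on a two-level shell fallback: sudo `which crictl || echo crictl` ps -a || sudo docker ps -a. The backticks substitute the absolute crictl path when which can resolve one (otherwise the bare name is left to PATH lookup), and if the CRI listing fails entirely, the docker CLI is tried instead. A sketch of invoking that exact line from Go, the way ssh_runner does with /bin/bash -c:

	package main

	import (
		"fmt"
		"os/exec"
	)

	func main() {
		// Backticks inside a double-quoted Go string are ordinary characters;
		// bash performs the command substitution when it runs the script.
		script := "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
		out, err := exec.Command("/bin/bash", "-c", script).CombinedOutput()
		if err != nil {
			fmt.Println("container status failed:", err)
		}
		fmt.Print(string(out))
	}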
	I1217 21:21:51.284370  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:51.307774  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:51.307850  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:51.352734  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:51.352756  588228 cri.go:89] found id: ""
	I1217 21:21:51.352765  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:51.352820  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:51.356559  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:51.356628  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:51.381974  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:51.381995  588228 cri.go:89] found id: ""
	I1217 21:21:51.382003  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:51.382059  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:51.385575  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:51.385650  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:51.410125  588228 cri.go:89] found id: ""
	I1217 21:21:51.410148  588228 logs.go:282] 0 containers: []
	W1217 21:21:51.410157  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:51.410164  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:51.410228  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:51.435513  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:51.435534  588228 cri.go:89] found id: ""
	I1217 21:21:51.435543  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:51.435600  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:51.439290  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:51.439389  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:51.468406  588228 cri.go:89] found id: ""
	I1217 21:21:51.468429  588228 logs.go:282] 0 containers: []
	W1217 21:21:51.468438  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:51.468444  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:51.468506  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:51.494429  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:51.494451  588228 cri.go:89] found id: ""
	I1217 21:21:51.494459  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:51.494519  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:51.498394  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:51.498764  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:51.529217  588228 cri.go:89] found id: ""
	I1217 21:21:51.529290  588228 logs.go:282] 0 containers: []
	W1217 21:21:51.529315  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:51.529334  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:51.529430  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:51.556105  588228 cri.go:89] found id: ""
	I1217 21:21:51.556182  588228 logs.go:282] 0 containers: []
	W1217 21:21:51.556206  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:51.556240  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:51.556304  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:51.632366  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:51.632404  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:51.647329  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:51.647358  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:51.681834  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:51.681868  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:51.713079  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:51.713110  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:51.748925  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:51.748957  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:51.790848  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:51.790874  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:51.864412  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:51.864437  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:51.864450  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:51.897749  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:51.897781  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
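	For every container ID the enumeration turns up, the pass then tails that container's output with sudo /usr/local/bin/crictl logs --tail 400 <id>. A matching sketch in Go; the example ID is copied from the log above purely as an argument, and CombinedOutput is used defensively to capture both of crictl's output streams:

	package main

	import (
		"fmt"
		"os/exec"
	)

	// tailContainer mirrors: crictl logs --tail <n> <id>
	func tailContainer(id string, n int) (string, error) {
		out, err := exec.Command("crictl", "logs", "--tail", fmt.Sprint(n), id).CombinedOutput()
		return string(out), err
	}

	func main() {
		out, err := tailContainer("f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501", 400)
		if err != nil {
			fmt.Println("crictl logs failed:", err)
		}
		fmt.Print(out)
	}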
	I1217 21:21:54.428731  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:54.439858  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:54.439929  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:54.469468  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:54.469493  588228 cri.go:89] found id: ""
	I1217 21:21:54.469502  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:54.469559  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:54.473743  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:54.473822  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:54.517473  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:54.517497  588228 cri.go:89] found id: ""
	I1217 21:21:54.517505  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:54.517559  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:54.522496  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:54.522569  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:54.583405  588228 cri.go:89] found id: ""
	I1217 21:21:54.583434  588228 logs.go:282] 0 containers: []
	W1217 21:21:54.583447  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:54.583459  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:54.583524  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:54.638829  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:54.638854  588228 cri.go:89] found id: ""
	I1217 21:21:54.638862  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:54.638935  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:54.644226  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:54.644438  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:54.683879  588228 cri.go:89] found id: ""
	I1217 21:21:54.683905  588228 logs.go:282] 0 containers: []
	W1217 21:21:54.683944  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:54.683963  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:54.684052  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:54.719953  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:54.719975  588228 cri.go:89] found id: ""
	I1217 21:21:54.719987  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:54.720074  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:54.724688  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:54.724802  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:54.767917  588228 cri.go:89] found id: ""
	I1217 21:21:54.767947  588228 logs.go:282] 0 containers: []
	W1217 21:21:54.767957  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:54.767999  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:54.768092  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:54.800317  588228 cri.go:89] found id: ""
	I1217 21:21:54.800355  588228 logs.go:282] 0 containers: []
	W1217 21:21:54.800365  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:54.800432  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:54.800451  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:54.819038  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:54.819081  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:54.892973  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:54.892996  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:54.893009  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:54.927178  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:54.927216  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:21:54.956836  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:54.956874  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:54.997790  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:54.997837  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:55.084667  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:55.084716  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:55.121466  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:55.121499  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:55.154522  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:55.154555  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:57.691008  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:21:57.702738  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:21:57.702815  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:21:57.731317  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:57.731337  588228 cri.go:89] found id: ""
	I1217 21:21:57.731345  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:21:57.731401  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:57.735531  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:21:57.735602  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:21:57.775069  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:57.775088  588228 cri.go:89] found id: ""
	I1217 21:21:57.775096  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:21:57.775152  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:57.779302  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:21:57.779371  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:21:57.830474  588228 cri.go:89] found id: ""
	I1217 21:21:57.830499  588228 logs.go:282] 0 containers: []
	W1217 21:21:57.830507  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:21:57.830514  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:21:57.830569  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:21:57.863911  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:57.863930  588228 cri.go:89] found id: ""
	I1217 21:21:57.863939  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:21:57.863993  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:57.868395  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:21:57.868465  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:21:57.926432  588228 cri.go:89] found id: ""
	I1217 21:21:57.926455  588228 logs.go:282] 0 containers: []
	W1217 21:21:57.926464  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:21:57.926471  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:21:57.926614  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:21:57.959448  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:57.959467  588228 cri.go:89] found id: ""
	I1217 21:21:57.959475  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:21:57.959532  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:21:57.963604  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:21:57.963674  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:21:58.041814  588228 cri.go:89] found id: ""
	I1217 21:21:58.041836  588228 logs.go:282] 0 containers: []
	W1217 21:21:58.041844  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:21:58.041850  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:21:58.041914  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:21:58.098698  588228 cri.go:89] found id: ""
	I1217 21:21:58.098719  588228 logs.go:282] 0 containers: []
	W1217 21:21:58.098728  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:21:58.098744  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:21:58.098756  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:21:58.115174  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:21:58.115239  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:21:58.169727  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:21:58.169785  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:21:58.215998  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:21:58.216085  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:21:58.274665  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:21:58.274742  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:21:58.318622  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:21:58.318647  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:21:58.383854  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:21:58.383888  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:21:58.452211  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:21:58.452319  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:21:58.452348  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:21:58.484215  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:21:58.484258  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:01.013665  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:01.033930  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:01.033988  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:01.077300  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:01.077320  588228 cri.go:89] found id: ""
	I1217 21:22:01.077328  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:01.077384  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:01.088890  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:01.089024  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:01.125614  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:01.125635  588228 cri.go:89] found id: ""
	I1217 21:22:01.125642  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:01.125698  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:01.129917  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:01.130049  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:01.161311  588228 cri.go:89] found id: ""
	I1217 21:22:01.161389  588228 logs.go:282] 0 containers: []
	W1217 21:22:01.161412  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:01.161431  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:01.161572  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:01.200581  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:01.200650  588228 cri.go:89] found id: ""
	I1217 21:22:01.200672  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:01.200760  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:01.205277  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:01.205432  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:01.254605  588228 cri.go:89] found id: ""
	I1217 21:22:01.254681  588228 logs.go:282] 0 containers: []
	W1217 21:22:01.254704  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:01.254723  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:01.254807  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:01.304962  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:01.305032  588228 cri.go:89] found id: ""
	I1217 21:22:01.305053  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:01.305142  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:01.313413  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:01.313539  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:01.358582  588228 cri.go:89] found id: ""
	I1217 21:22:01.358603  588228 logs.go:282] 0 containers: []
	W1217 21:22:01.358612  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:01.358618  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:01.358679  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:01.400290  588228 cri.go:89] found id: ""
	I1217 21:22:01.400313  588228 logs.go:282] 0 containers: []
	W1217 21:22:01.400321  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:01.400337  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:01.400348  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:01.436068  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:01.436145  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:01.467196  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:01.467225  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:01.534748  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:01.534886  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:01.553000  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:01.553027  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:01.644921  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:01.644940  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:01.644952  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:01.698483  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:01.698560  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:01.751784  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:01.751865  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:01.853360  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:01.853432  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:04.404065  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:04.414375  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:04.414448  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:04.439862  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:04.439884  588228 cri.go:89] found id: ""
	I1217 21:22:04.439892  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:04.439949  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:04.443877  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:04.443992  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:04.474021  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:04.474042  588228 cri.go:89] found id: ""
	I1217 21:22:04.474056  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:04.474115  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:04.477943  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:04.478031  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:04.504574  588228 cri.go:89] found id: ""
	I1217 21:22:04.504639  588228 logs.go:282] 0 containers: []
	W1217 21:22:04.504662  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:04.504681  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:04.504767  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:04.538225  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:04.538250  588228 cri.go:89] found id: ""
	I1217 21:22:04.538258  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:04.538317  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:04.542705  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:04.542789  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:04.585983  588228 cri.go:89] found id: ""
	I1217 21:22:04.586007  588228 logs.go:282] 0 containers: []
	W1217 21:22:04.586015  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:04.586022  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:04.586079  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:04.624966  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:04.625026  588228 cri.go:89] found id: ""
	I1217 21:22:04.625068  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:04.625197  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:04.631653  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:04.631836  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:04.679653  588228 cri.go:89] found id: ""
	I1217 21:22:04.679677  588228 logs.go:282] 0 containers: []
	W1217 21:22:04.679686  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:04.679692  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:04.679753  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:04.713860  588228 cri.go:89] found id: ""
	I1217 21:22:04.713886  588228 logs.go:282] 0 containers: []
	W1217 21:22:04.713894  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:04.713907  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:04.713918  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:04.762932  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:04.762964  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:04.879732  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:04.879766  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:04.956658  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:04.956747  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:04.974011  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:04.974039  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:05.062550  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:05.062571  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:05.062585  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:05.104738  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:05.104816  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:05.161642  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:05.161722  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:05.204061  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:05.204097  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:07.775042  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:07.785821  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:07.785891  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:07.810247  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:07.810271  588228 cri.go:89] found id: ""
	I1217 21:22:07.810279  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:07.810337  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:07.813995  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:07.814069  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:07.839792  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:07.839819  588228 cri.go:89] found id: ""
	I1217 21:22:07.839827  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:07.839916  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:07.843786  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:07.843910  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:07.869755  588228 cri.go:89] found id: ""
	I1217 21:22:07.869781  588228 logs.go:282] 0 containers: []
	W1217 21:22:07.869789  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:07.869798  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:07.869859  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:07.894819  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:07.894841  588228 cri.go:89] found id: ""
	I1217 21:22:07.894849  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:07.894913  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:07.898665  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:07.898740  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:07.924839  588228 cri.go:89] found id: ""
	I1217 21:22:07.924863  588228 logs.go:282] 0 containers: []
	W1217 21:22:07.924871  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:07.924878  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:07.924941  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:07.950073  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:07.950098  588228 cri.go:89] found id: ""
	I1217 21:22:07.950106  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:07.950186  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:07.953976  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:07.954045  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:07.978888  588228 cri.go:89] found id: ""
	I1217 21:22:07.978910  588228 logs.go:282] 0 containers: []
	W1217 21:22:07.978920  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:07.978926  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:07.978992  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:08.004079  588228 cri.go:89] found id: ""
	I1217 21:22:08.004105  588228 logs.go:282] 0 containers: []
	W1217 21:22:08.004116  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:08.004133  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:08.004146  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:08.022491  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:08.022520  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:08.087015  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:08.087034  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:08.087046  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:08.120512  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:08.120544  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:08.168736  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:08.168768  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:08.206577  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:08.206676  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:08.274306  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:08.274388  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:08.310073  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:08.310155  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:08.377203  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:08.377253  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
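Each retry cycle starts by probing for one container per control-plane component, using the crictl ps calls shown above. A condensed sketch of that probe loop, assuming crictl is on the PATH inside the node:

    # One lookup per expected component; empty output means no container was found
    for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
                kube-controller-manager kindnet storage-provisioner; do
      sudo crictl ps -a --quiet --name="$name"
    done

In this run only kube-apiserver, etcd, kube-scheduler, and kube-controller-manager return an ID; the empty results match the "0 containers" warnings repeated throughout the log.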
	I1217 21:22:10.917260  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:10.927597  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:10.927663  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:10.953384  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:10.953405  588228 cri.go:89] found id: ""
	I1217 21:22:10.953413  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:10.953471  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:10.957303  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:10.957378  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:10.981952  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:10.981973  588228 cri.go:89] found id: ""
	I1217 21:22:10.981981  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:10.982037  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:10.985792  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:10.985865  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:11.012817  588228 cri.go:89] found id: ""
	I1217 21:22:11.012842  588228 logs.go:282] 0 containers: []
	W1217 21:22:11.012852  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:11.012858  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:11.012923  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:11.038612  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:11.038680  588228 cri.go:89] found id: ""
	I1217 21:22:11.038693  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:11.038761  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:11.042488  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:11.042566  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:11.068060  588228 cri.go:89] found id: ""
	I1217 21:22:11.068136  588228 logs.go:282] 0 containers: []
	W1217 21:22:11.068160  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:11.068178  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:11.068290  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:11.094046  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:11.094069  588228 cri.go:89] found id: ""
	I1217 21:22:11.094077  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:11.094133  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:11.098017  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:11.098104  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:11.124028  588228 cri.go:89] found id: ""
	I1217 21:22:11.124051  588228 logs.go:282] 0 containers: []
	W1217 21:22:11.124060  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:11.124067  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:11.124128  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:11.159684  588228 cri.go:89] found id: ""
	I1217 21:22:11.159709  588228 logs.go:282] 0 containers: []
	W1217 21:22:11.159720  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:11.159733  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:11.159745  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:11.219377  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:11.219416  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:11.235056  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:11.235093  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:11.279738  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:11.279769  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:11.317585  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:11.317617  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:11.356527  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:11.356565  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:11.386997  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:11.387022  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:11.452239  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:11.452279  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:11.452293  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:11.486907  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:11.486981  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:14.021829  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:14.031947  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:14.032017  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:14.057543  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:14.057570  588228 cri.go:89] found id: ""
	I1217 21:22:14.057578  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:14.057634  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:14.061970  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:14.062048  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:14.087378  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:14.087401  588228 cri.go:89] found id: ""
	I1217 21:22:14.087410  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:14.087464  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:14.091627  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:14.091696  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:14.117476  588228 cri.go:89] found id: ""
	I1217 21:22:14.117499  588228 logs.go:282] 0 containers: []
	W1217 21:22:14.117508  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:14.117514  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:14.117578  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:14.144081  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:14.144100  588228 cri.go:89] found id: ""
	I1217 21:22:14.144108  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:14.144163  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:14.147826  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:14.147895  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:14.174135  588228 cri.go:89] found id: ""
	I1217 21:22:14.174156  588228 logs.go:282] 0 containers: []
	W1217 21:22:14.174165  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:14.174171  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:14.174232  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:14.203344  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:14.203368  588228 cri.go:89] found id: ""
	I1217 21:22:14.203376  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:14.203439  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:14.207100  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:14.207181  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:14.232350  588228 cri.go:89] found id: ""
	I1217 21:22:14.232371  588228 logs.go:282] 0 containers: []
	W1217 21:22:14.232379  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:14.232386  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:14.232442  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:14.258938  588228 cri.go:89] found id: ""
	I1217 21:22:14.258960  588228 logs.go:282] 0 containers: []
	W1217 21:22:14.258969  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:14.258982  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:14.258994  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:14.293778  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:14.293810  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:14.322474  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:14.322503  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:14.380411  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:14.380448  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:14.416201  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:14.416233  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:14.446832  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:14.446867  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:14.462579  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:14.462604  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:14.560697  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:14.560755  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:14.560791  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:14.596961  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:14.596998  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:17.132721  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:17.143591  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:17.143656  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:17.172073  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:17.172092  588228 cri.go:89] found id: ""
	I1217 21:22:17.172100  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:17.172156  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:17.176480  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:17.176554  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:17.205445  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:17.205464  588228 cri.go:89] found id: ""
	I1217 21:22:17.205472  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:17.205527  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:17.209817  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:17.209883  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:17.237139  588228 cri.go:89] found id: ""
	I1217 21:22:17.237162  588228 logs.go:282] 0 containers: []
	W1217 21:22:17.237171  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:17.237178  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:17.237241  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:17.266582  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:17.266600  588228 cri.go:89] found id: ""
	I1217 21:22:17.266608  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:17.266672  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:17.271001  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:17.271077  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:17.306588  588228 cri.go:89] found id: ""
	I1217 21:22:17.306609  588228 logs.go:282] 0 containers: []
	W1217 21:22:17.306618  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:17.306625  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:17.306682  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:17.342576  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:17.342595  588228 cri.go:89] found id: ""
	I1217 21:22:17.342603  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:17.342658  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:17.346999  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:17.347118  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:17.391119  588228 cri.go:89] found id: ""
	I1217 21:22:17.391140  588228 logs.go:282] 0 containers: []
	W1217 21:22:17.391148  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:17.391155  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:17.391217  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:17.420828  588228 cri.go:89] found id: ""
	I1217 21:22:17.420854  588228 logs.go:282] 0 containers: []
	W1217 21:22:17.420863  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:17.420905  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:17.420924  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:17.435901  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:17.435929  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:17.471550  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:17.471584  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:17.543082  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:17.543115  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:17.627872  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:17.627904  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:17.658832  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:17.658871  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:17.694618  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:17.694645  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:17.757945  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:17.757981  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:17.831174  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:17.831248  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:17.831285  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
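Every describe-nodes attempt in this section fails identically because nothing is listening on localhost:8443. A minimal manual reproduction from inside the node, reusing the exact binary and kubeconfig paths from the failing command above:

    sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes \
      --kubeconfig=/var/lib/minikube/kubeconfig
    # While the apiserver is down this exits with status 1 and prints:
    #   The connection to the server localhost:8443 was refused - did you specify the right host or port?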
	I1217 21:22:20.387747  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:20.397901  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:20.397971  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:20.425502  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:20.425523  588228 cri.go:89] found id: ""
	I1217 21:22:20.425531  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:20.425590  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:20.429326  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:20.429404  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:20.454313  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:20.454336  588228 cri.go:89] found id: ""
	I1217 21:22:20.454344  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:20.454468  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:20.458107  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:20.458176  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:20.482677  588228 cri.go:89] found id: ""
	I1217 21:22:20.482703  588228 logs.go:282] 0 containers: []
	W1217 21:22:20.482712  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:20.482718  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:20.482775  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:20.510700  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:20.510724  588228 cri.go:89] found id: ""
	I1217 21:22:20.510733  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:20.510802  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:20.515226  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:20.515325  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:20.542283  588228 cri.go:89] found id: ""
	I1217 21:22:20.542308  588228 logs.go:282] 0 containers: []
	W1217 21:22:20.542317  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:20.542324  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:20.542381  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:20.579730  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:20.579754  588228 cri.go:89] found id: ""
	I1217 21:22:20.579762  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:20.579834  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:20.583366  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:20.583435  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:20.608957  588228 cri.go:89] found id: ""
	I1217 21:22:20.608980  588228 logs.go:282] 0 containers: []
	W1217 21:22:20.608989  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:20.608996  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:20.609061  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:20.633865  588228 cri.go:89] found id: ""
	I1217 21:22:20.633935  588228 logs.go:282] 0 containers: []
	W1217 21:22:20.633957  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:20.633988  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:20.634025  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:20.692558  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:20.692594  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:20.757529  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:20.757548  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:20.757560  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:20.793842  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:20.793876  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:20.823198  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:20.823235  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:20.838796  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:20.838825  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:20.886824  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:20.886860  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:20.928761  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:20.928793  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:20.976051  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:20.976083  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:23.512432  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:23.524402  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:23.524495  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:23.563962  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:23.563992  588228 cri.go:89] found id: ""
	I1217 21:22:23.564001  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:23.564058  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:23.567990  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:23.568065  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:23.594281  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:23.594302  588228 cri.go:89] found id: ""
	I1217 21:22:23.594310  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:23.594384  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:23.598257  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:23.598330  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:23.636974  588228 cri.go:89] found id: ""
	I1217 21:22:23.636998  588228 logs.go:282] 0 containers: []
	W1217 21:22:23.637016  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:23.637023  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:23.637118  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:23.663679  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:23.663704  588228 cri.go:89] found id: ""
	I1217 21:22:23.663712  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:23.663787  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:23.667685  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:23.667800  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:23.693289  588228 cri.go:89] found id: ""
	I1217 21:22:23.693357  588228 logs.go:282] 0 containers: []
	W1217 21:22:23.693380  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:23.693397  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:23.693472  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:23.718989  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:23.719013  588228 cri.go:89] found id: ""
	I1217 21:22:23.719022  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:23.719090  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:23.723193  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:23.723311  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:23.749119  588228 cri.go:89] found id: ""
	I1217 21:22:23.749155  588228 logs.go:282] 0 containers: []
	W1217 21:22:23.749165  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:23.749171  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:23.749249  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:23.774851  588228 cri.go:89] found id: ""
	I1217 21:22:23.774877  588228 logs.go:282] 0 containers: []
	W1217 21:22:23.774895  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:23.774910  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:23.774923  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:23.843480  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:23.843502  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:23.843516  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:23.877738  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:23.877777  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:23.909756  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:23.909785  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:23.963238  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:23.963271  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:23.990657  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:23.990685  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:24.007891  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:24.007932  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:24.049666  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:24.049700  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:24.080580  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:24.080613  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
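The pgrep probe recurs on a roughly three-second cadence (21:22:04, 21:22:07, 21:22:10, ...), so this whole section is effectively one wait loop. A sketch of the equivalent shell loop, using the exact pattern from the Run: lines; the three-second sleep is inferred from the timestamps, not from minikube's source:

    # Poll until a kube-apiserver process for this profile shows up
    until sudo pgrep -xnf 'kube-apiserver.*minikube.*'; do
      sleep 3
    done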
	I1217 21:22:26.644558  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:26.654693  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:26.654766  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:26.679809  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:26.679828  588228 cri.go:89] found id: ""
	I1217 21:22:26.679836  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:26.679891  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:26.683637  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:26.683714  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:26.708500  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:26.708520  588228 cri.go:89] found id: ""
	I1217 21:22:26.708528  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:26.708582  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:26.712560  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:26.712637  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:26.737521  588228 cri.go:89] found id: ""
	I1217 21:22:26.737545  588228 logs.go:282] 0 containers: []
	W1217 21:22:26.737554  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:26.737560  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:26.737619  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:26.765320  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:26.765344  588228 cri.go:89] found id: ""
	I1217 21:22:26.765352  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:26.765408  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:26.769099  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:26.769169  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:26.794364  588228 cri.go:89] found id: ""
	I1217 21:22:26.794388  588228 logs.go:282] 0 containers: []
	W1217 21:22:26.794396  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:26.794403  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:26.794459  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:26.819409  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:26.819433  588228 cri.go:89] found id: ""
	I1217 21:22:26.819441  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:26.819500  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:26.823680  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:26.823774  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:26.853055  588228 cri.go:89] found id: ""
	I1217 21:22:26.853079  588228 logs.go:282] 0 containers: []
	W1217 21:22:26.853088  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:26.853094  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:26.853156  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:26.881202  588228 cri.go:89] found id: ""
	I1217 21:22:26.881224  588228 logs.go:282] 0 containers: []
	W1217 21:22:26.881234  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:26.881248  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:26.881262  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:26.914556  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:26.914589  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:26.954290  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:26.954333  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:26.985525  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:26.985566  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:27.000899  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:27.000927  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:27.068390  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	I1217 21:22:27.068413  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:27.068429  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:27.106143  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:27.106174  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:27.138577  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:27.138611  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:27.169480  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:27.169509  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:29.728816  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:29.740720  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:29.740817  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:29.782264  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:29.782337  588228 cri.go:89] found id: ""
	I1217 21:22:29.782348  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:29.782432  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:29.787341  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:29.787465  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:29.813529  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:29.813597  588228 cri.go:89] found id: ""
	I1217 21:22:29.813619  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:29.813701  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:29.817445  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:29.817543  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:29.844377  588228 cri.go:89] found id: ""
	I1217 21:22:29.844401  588228 logs.go:282] 0 containers: []
	W1217 21:22:29.844411  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:29.844449  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:29.844535  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:29.873561  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:29.873584  588228 cri.go:89] found id: ""
	I1217 21:22:29.873592  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:29.873668  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:29.877397  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:29.877496  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:29.907006  588228 cri.go:89] found id: ""
	I1217 21:22:29.907028  588228 logs.go:282] 0 containers: []
	W1217 21:22:29.907037  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:29.907043  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:29.907127  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:29.933778  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:29.933841  588228 cri.go:89] found id: ""
	I1217 21:22:29.933871  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:29.933934  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:29.937903  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:29.938021  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:29.963175  588228 cri.go:89] found id: ""
	I1217 21:22:29.963205  588228 logs.go:282] 0 containers: []
	W1217 21:22:29.963215  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:29.963222  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:29.963299  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:29.989991  588228 cri.go:89] found id: ""
	I1217 21:22:29.990068  588228 logs.go:282] 0 containers: []
	W1217 21:22:29.990083  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:29.990099  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:29.990111  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:30.007852  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:30.007886  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:30.079971  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:30.080068  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:30.133602  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:30.133643  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:30.164124  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:30.164181  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:30.236188  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
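	Every "describe nodes" attempt in this window fails the same way: kubectl, pointed at the in-node kubeconfig (/var/lib/minikube/kubeconfig), gets connection refused on localhost:8443. Combined with the container probes above, that means the kube-apiserver container exists but nothing is accepting connections on the apiserver port, so the process is most likely crash-looping before it can bind. A quick way to confirm from inside the node (a sketch; /healthz may answer with an auth error rather than "ok" depending on anonymous-auth, but "connection refused" always means nothing is listening):
	
	    sudo crictl ps --name=kube-apiserver      # without -a: Running containers only
	    curl -sk https://localhost:8443/healthz || echo "apiserver not serving"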
	I1217 21:22:30.236222  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:30.236238  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:30.278895  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:30.278929  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:30.313420  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:30.313453  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:30.357068  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:30.357095  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
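	Each "Gathering logs for X" line is immediately followed by the exact remote command the runner executes, so the whole diagnostic bundle can be pulled manually with the same handful of commands (copied verbatim from the cycle above; substitute a real container ID from the crictl output):
	
	    sudo journalctl -u kubelet -n 400
	    sudo journalctl -u containerd -n 400
	    sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400
	    sudo /usr/local/bin/crictl logs --tail 400 <container-id>
	    sudo `which crictl || echo crictl` ps -a || sudo docker ps -a
	
	The last line is a fallback chain: use crictl when it is on PATH, otherwise fall back to docker ps so the container inventory is captured either way.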
	I1217 21:22:32.918816  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:32.929671  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:32.929744  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:32.957450  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:32.957475  588228 cri.go:89] found id: ""
	I1217 21:22:32.957483  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:32.957541  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:32.961350  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:32.961423  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:32.986240  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:32.986261  588228 cri.go:89] found id: ""
	I1217 21:22:32.986269  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:32.986322  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:32.990012  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:32.990083  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:33.018710  588228 cri.go:89] found id: ""
	I1217 21:22:33.018733  588228 logs.go:282] 0 containers: []
	W1217 21:22:33.018743  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:33.018749  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:33.018810  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:33.043948  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:33.043969  588228 cri.go:89] found id: ""
	I1217 21:22:33.043978  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:33.044044  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:33.047876  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:33.047949  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:33.073905  588228 cri.go:89] found id: ""
	I1217 21:22:33.073928  588228 logs.go:282] 0 containers: []
	W1217 21:22:33.073936  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:33.073942  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:33.074001  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:33.098215  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:33.098281  588228 cri.go:89] found id: ""
	I1217 21:22:33.098301  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:33.098382  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:33.102452  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:33.102526  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:33.127043  588228 cri.go:89] found id: ""
	I1217 21:22:33.127070  588228 logs.go:282] 0 containers: []
	W1217 21:22:33.127080  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:33.127086  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:33.127149  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:33.152362  588228 cri.go:89] found id: ""
	I1217 21:22:33.152386  588228 logs.go:282] 0 containers: []
	W1217 21:22:33.152396  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:33.152409  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:33.152422  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:33.185877  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:33.185948  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:33.222742  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:33.222780  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:33.252883  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:33.252920  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:33.297923  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:33.297953  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:33.364647  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:33.364683  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:33.380366  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:33.380393  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:33.416479  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:33.416509  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:33.452684  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:33.452717  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:33.517753  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
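	The cycle then repeats on a fixed cadence: the timestamps show a fresh "pgrep -xnf kube-apiserver.*minikube.*" probe roughly every three seconds, each followed by the same container sweep and log gathering while the apiserver stays unreachable. A rough sketch of the wait this implements (an approximation inferred from the log timestamps, not minikube's actual control flow, which also checks apiserver health):
	
	    # re-probe for a running apiserver process about every 3s
	    until sudo pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null; do
	      sleep 3
	    done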
	I1217 21:22:36.018037  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:36.028782  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:36.028853  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:36.058771  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:36.058815  588228 cri.go:89] found id: ""
	I1217 21:22:36.058824  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:36.058894  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:36.062632  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:36.062706  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:36.089017  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:36.089089  588228 cri.go:89] found id: ""
	I1217 21:22:36.089112  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:36.089180  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:36.093140  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:36.093240  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:36.121937  588228 cri.go:89] found id: ""
	I1217 21:22:36.121959  588228 logs.go:282] 0 containers: []
	W1217 21:22:36.121968  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:36.121974  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:36.122034  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:36.148504  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:36.148523  588228 cri.go:89] found id: ""
	I1217 21:22:36.148531  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:36.148595  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:36.152136  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:36.152219  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:36.178312  588228 cri.go:89] found id: ""
	I1217 21:22:36.178388  588228 logs.go:282] 0 containers: []
	W1217 21:22:36.178411  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:36.178430  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:36.178511  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:36.206775  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:36.206798  588228 cri.go:89] found id: ""
	I1217 21:22:36.206806  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:36.206863  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:36.210498  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:36.210614  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:36.237461  588228 cri.go:89] found id: ""
	I1217 21:22:36.237496  588228 logs.go:282] 0 containers: []
	W1217 21:22:36.237506  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:36.237537  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:36.237614  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:36.274730  588228 cri.go:89] found id: ""
	I1217 21:22:36.274795  588228 logs.go:282] 0 containers: []
	W1217 21:22:36.274817  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:36.274841  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:36.274877  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:36.343208  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:36.343246  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:36.377387  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:36.377421  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:36.414425  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:36.414461  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:36.444060  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:36.444097  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:36.484238  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:36.484333  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:36.499740  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:36.499811  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:36.568392  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:36.568465  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:36.568485  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:36.603184  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:36.603213  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:39.135722  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:39.145786  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:39.145858  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:39.171361  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:39.171382  588228 cri.go:89] found id: ""
	I1217 21:22:39.171390  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:39.171448  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:39.175272  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:39.175342  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:39.199167  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:39.199189  588228 cri.go:89] found id: ""
	I1217 21:22:39.199197  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:39.199253  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:39.203026  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:39.203093  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:39.229670  588228 cri.go:89] found id: ""
	I1217 21:22:39.229748  588228 logs.go:282] 0 containers: []
	W1217 21:22:39.229763  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:39.229771  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:39.229833  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:39.256775  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:39.256797  588228 cri.go:89] found id: ""
	I1217 21:22:39.256805  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:39.256859  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:39.261222  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:39.261292  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:39.287694  588228 cri.go:89] found id: ""
	I1217 21:22:39.287715  588228 logs.go:282] 0 containers: []
	W1217 21:22:39.287725  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:39.287731  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:39.287789  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:39.319130  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:39.319149  588228 cri.go:89] found id: ""
	I1217 21:22:39.319156  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:39.319211  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:39.323171  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:39.323237  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:39.348506  588228 cri.go:89] found id: ""
	I1217 21:22:39.348529  588228 logs.go:282] 0 containers: []
	W1217 21:22:39.348539  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:39.348545  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:39.348604  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:39.373732  588228 cri.go:89] found id: ""
	I1217 21:22:39.373757  588228 logs.go:282] 0 containers: []
	W1217 21:22:39.373765  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:39.373779  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:39.373790  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:39.431560  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:39.431602  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:39.446818  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:39.446847  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:39.517029  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:39.517050  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:39.517063  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:39.561494  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:39.561524  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:39.599229  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:39.599260  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:39.634972  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:39.635011  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:39.665139  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:39.665171  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:39.718654  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:39.718691  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:42.247482  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:42.260364  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:42.260458  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:42.290886  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:42.290960  588228 cri.go:89] found id: ""
	I1217 21:22:42.290981  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:42.291069  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:42.295469  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:42.295540  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:42.327541  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:42.327569  588228 cri.go:89] found id: ""
	I1217 21:22:42.327577  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:42.327633  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:42.331628  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:42.331760  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:42.357945  588228 cri.go:89] found id: ""
	I1217 21:22:42.357971  588228 logs.go:282] 0 containers: []
	W1217 21:22:42.357980  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:42.357987  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:42.358044  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:42.384157  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:42.384178  588228 cri.go:89] found id: ""
	I1217 21:22:42.384186  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:42.384244  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:42.387976  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:42.388048  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:42.412847  588228 cri.go:89] found id: ""
	I1217 21:22:42.412871  588228 logs.go:282] 0 containers: []
	W1217 21:22:42.412880  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:42.412886  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:42.412951  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:42.438252  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:42.438275  588228 cri.go:89] found id: ""
	I1217 21:22:42.438284  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:42.438363  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:42.442138  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:42.442211  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:42.467934  588228 cri.go:89] found id: ""
	I1217 21:22:42.467957  588228 logs.go:282] 0 containers: []
	W1217 21:22:42.467965  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:42.467972  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:42.468039  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:42.494177  588228 cri.go:89] found id: ""
	I1217 21:22:42.494198  588228 logs.go:282] 0 containers: []
	W1217 21:22:42.494207  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:42.494220  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:42.494231  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:42.560718  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:42.560738  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:42.560753  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:42.593225  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:42.593282  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:42.633384  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:42.633417  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:42.662770  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:42.662812  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:42.711396  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:42.711430  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:42.748675  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:42.748707  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:42.776103  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:42.776137  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:42.833477  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:42.833511  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:45.348363  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:45.360027  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:45.360226  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:45.387014  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:45.387035  588228 cri.go:89] found id: ""
	I1217 21:22:45.387043  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:45.387100  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:45.390737  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:45.390805  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:45.416728  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:45.416749  588228 cri.go:89] found id: ""
	I1217 21:22:45.416757  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:45.416814  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:45.420404  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:45.420478  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:45.446484  588228 cri.go:89] found id: ""
	I1217 21:22:45.446508  588228 logs.go:282] 0 containers: []
	W1217 21:22:45.446518  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:45.446524  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:45.446584  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:45.472943  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:45.473018  588228 cri.go:89] found id: ""
	I1217 21:22:45.473040  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:45.473125  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:45.476816  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:45.476913  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:45.501253  588228 cri.go:89] found id: ""
	I1217 21:22:45.501274  588228 logs.go:282] 0 containers: []
	W1217 21:22:45.501283  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:45.501289  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:45.501349  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:45.531828  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:45.531847  588228 cri.go:89] found id: ""
	I1217 21:22:45.531855  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:45.531912  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:45.535662  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:45.535737  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:45.568858  588228 cri.go:89] found id: ""
	I1217 21:22:45.568880  588228 logs.go:282] 0 containers: []
	W1217 21:22:45.568888  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:45.568895  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:45.568952  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:45.594388  588228 cri.go:89] found id: ""
	I1217 21:22:45.594462  588228 logs.go:282] 0 containers: []
	W1217 21:22:45.594485  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:45.594507  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:45.594532  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:45.626709  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:45.626740  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:45.671213  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:45.671249  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:45.702517  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:45.702551  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:45.764495  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:45.764530  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:45.779579  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:45.779611  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:45.848978  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:45.848997  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:45.849010  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:45.888090  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:45.888118  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:45.925847  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:45.925877  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:48.465411  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:48.475835  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:48.475908  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:48.501809  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:48.501837  588228 cri.go:89] found id: ""
	I1217 21:22:48.501845  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:48.501906  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:48.505755  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:48.505829  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:48.531197  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:48.531220  588228 cri.go:89] found id: ""
	I1217 21:22:48.531228  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:48.531285  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:48.535281  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:48.535356  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:48.568562  588228 cri.go:89] found id: ""
	I1217 21:22:48.568585  588228 logs.go:282] 0 containers: []
	W1217 21:22:48.568594  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:48.568600  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:48.568661  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:48.594000  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:48.594063  588228 cri.go:89] found id: ""
	I1217 21:22:48.594085  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:48.594147  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:48.597979  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:48.598077  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:48.626852  588228 cri.go:89] found id: ""
	I1217 21:22:48.626876  588228 logs.go:282] 0 containers: []
	W1217 21:22:48.626884  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:48.626891  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:48.626952  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:48.651787  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:48.651808  588228 cri.go:89] found id: ""
	I1217 21:22:48.651816  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:48.651872  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:48.655508  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:48.655579  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:48.679144  588228 cri.go:89] found id: ""
	I1217 21:22:48.679167  588228 logs.go:282] 0 containers: []
	W1217 21:22:48.679176  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:48.679183  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:48.679260  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:48.703458  588228 cri.go:89] found id: ""
	I1217 21:22:48.703480  588228 logs.go:282] 0 containers: []
	W1217 21:22:48.703489  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:48.703521  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:48.703538  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:48.718045  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:48.718073  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:48.785398  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:48.785419  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:48.785432  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:48.825978  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:48.826006  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:48.857962  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:48.857990  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:48.893853  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:48.893886  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:48.922855  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:48.922886  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:48.952289  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:48.952320  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:48.988591  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:48.988625  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:51.553169  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:51.564078  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:51.564156  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:51.594229  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:51.594248  588228 cri.go:89] found id: ""
	I1217 21:22:51.594255  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:51.594311  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:51.598111  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:51.598181  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:51.624937  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:51.624960  588228 cri.go:89] found id: ""
	I1217 21:22:51.624967  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:51.625023  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:51.629033  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:51.629119  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:51.655224  588228 cri.go:89] found id: ""
	I1217 21:22:51.655254  588228 logs.go:282] 0 containers: []
	W1217 21:22:51.655263  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:51.655270  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:51.655328  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:51.685461  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:51.685484  588228 cri.go:89] found id: ""
	I1217 21:22:51.685493  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:51.685563  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:51.689348  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:51.689444  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:51.714088  588228 cri.go:89] found id: ""
	I1217 21:22:51.714113  588228 logs.go:282] 0 containers: []
	W1217 21:22:51.714125  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:51.714131  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:51.714193  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:51.740780  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:51.740800  588228 cri.go:89] found id: ""
	I1217 21:22:51.740809  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:51.740893  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:51.744486  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:51.744560  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:51.769524  588228 cri.go:89] found id: ""
	I1217 21:22:51.769548  588228 logs.go:282] 0 containers: []
	W1217 21:22:51.769557  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:51.769563  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:51.769628  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:51.793908  588228 cri.go:89] found id: ""
	I1217 21:22:51.793931  588228 logs.go:282] 0 containers: []
	W1217 21:22:51.793940  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:51.793956  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:51.793993  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:51.863222  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:51.863243  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:51.863256  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:51.896149  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:51.896180  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:51.930076  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:51.930106  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:51.963462  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:51.963492  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:51.999899  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:51.999933  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:52.035902  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:52.035938  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:52.102867  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:52.102906  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:52.118699  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:52.118734  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:54.649031  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:54.663377  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:54.663449  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:54.694178  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:54.694201  588228 cri.go:89] found id: ""
	I1217 21:22:54.694209  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:54.694263  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:54.699364  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:54.699431  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:54.732980  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:54.733003  588228 cri.go:89] found id: ""
	I1217 21:22:54.733035  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:54.733111  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:54.737381  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:54.737452  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:54.774166  588228 cri.go:89] found id: ""
	I1217 21:22:54.774190  588228 logs.go:282] 0 containers: []
	W1217 21:22:54.774199  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:54.774206  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:54.774266  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:54.804630  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:54.804654  588228 cri.go:89] found id: ""
	I1217 21:22:54.804669  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:54.804731  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:54.811257  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:54.811330  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:54.858971  588228 cri.go:89] found id: ""
	I1217 21:22:54.859013  588228 logs.go:282] 0 containers: []
	W1217 21:22:54.859023  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:54.859029  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:54.859095  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:54.892326  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:54.892351  588228 cri.go:89] found id: ""
	I1217 21:22:54.892359  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:54.892417  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:54.896295  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:54.896368  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:54.925325  588228 cri.go:89] found id: ""
	I1217 21:22:54.925347  588228 logs.go:282] 0 containers: []
	W1217 21:22:54.925356  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:54.925362  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:54.925420  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:54.956568  588228 cri.go:89] found id: ""
	I1217 21:22:54.956591  588228 logs.go:282] 0 containers: []
	W1217 21:22:54.956599  588228 logs.go:284] No container was found matching "storage-provisioner"
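
Editor's note: the eight cri.go/logs.go pairs above all follow the same pattern: run `sudo crictl ps -a --quiet --name=<component>` and treat an empty result as "no container was found". A minimal Go sketch of that discovery loop, assuming crictl is on PATH and runnable via sudo; this is an illustration, not minikube's actual code:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	components := []string{
		"kube-apiserver", "etcd", "coredns", "kube-scheduler",
		"kube-proxy", "kube-controller-manager", "kindnet", "storage-provisioner",
	}
	for _, name := range components {
		// --quiet prints one container ID per line; -a includes exited containers
		out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+name).Output()
		ids := strings.Fields(string(out))
		if err != nil || len(ids) == 0 {
			fmt.Printf("no container was found matching %q\n", name)
			continue
		}
		fmt.Printf("%d containers: %v\n", len(ids), ids)
	}
}

In this run the loop finds only four of the eight components, which is why the subsequent log-gathering passes cover just kube-apiserver, etcd, kube-scheduler, and kube-controller-manager.
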
	I1217 21:22:54.956613  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:54.956640  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:55.016756  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:55.016817  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:55.058506  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:55.058539  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:55.093481  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:55.093514  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:55.127554  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:55.127587  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:55.169580  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:55.169611  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:55.201030  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:55.201065  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:55.232709  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:55.232736  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:55.248456  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:55.248482  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:55.311783  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
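
Editor's note: every `describe nodes` attempt in this run fails the same way: kubectl cannot reach the apiserver on localhost:8443. That symptom can be confirmed independently of kubectl with a plain TCP probe of the secure port; a hedged sketch (the port is taken from the log, the timeout is an arbitrary illustrative choice):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// "connection refused" above means nothing is listening on the port at all,
	// not that the apiserver rejected the request.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:8443", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver port closed:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver port is accepting connections")
}
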
	I1217 21:22:57.812355  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:22:57.823634  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:22:57.823700  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:22:57.865632  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:57.865651  588228 cri.go:89] found id: ""
	I1217 21:22:57.865659  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:22:57.865714  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:57.869484  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:22:57.869552  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:22:57.900779  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:22:57.900798  588228 cri.go:89] found id: ""
	I1217 21:22:57.900806  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:22:57.900863  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:57.905515  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:22:57.905641  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:22:57.934867  588228 cri.go:89] found id: ""
	I1217 21:22:57.934890  588228 logs.go:282] 0 containers: []
	W1217 21:22:57.934899  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:22:57.934905  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:22:57.934961  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:22:57.963991  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:57.964011  588228 cri.go:89] found id: ""
	I1217 21:22:57.964018  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:22:57.964080  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:57.968739  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:22:57.968811  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:22:58.018827  588228 cri.go:89] found id: ""
	I1217 21:22:58.018848  588228 logs.go:282] 0 containers: []
	W1217 21:22:58.018858  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:22:58.018865  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:22:58.018927  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:22:58.077019  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:58.077040  588228 cri.go:89] found id: ""
	I1217 21:22:58.077048  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:22:58.077112  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:22:58.082105  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:22:58.082177  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:22:58.134733  588228 cri.go:89] found id: ""
	I1217 21:22:58.134759  588228 logs.go:282] 0 containers: []
	W1217 21:22:58.134768  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:22:58.134774  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:22:58.134833  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:22:58.164385  588228 cri.go:89] found id: ""
	I1217 21:22:58.164410  588228 logs.go:282] 0 containers: []
	W1217 21:22:58.164419  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:22:58.164432  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:22:58.164444  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:22:58.212988  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:22:58.213028  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:22:58.248975  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:22:58.249016  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:22:58.304621  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:22:58.304646  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:22:58.374273  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:22:58.374313  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:22:58.418083  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:22:58.418131  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:22:58.472445  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:22:58.472483  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:22:58.490807  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:22:58.490889  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:22:58.566999  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:22:58.567019  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:22:58.567032  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:23:01.102050  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:23:01.112969  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:23:01.113035  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:23:01.151782  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:23:01.151803  588228 cri.go:89] found id: ""
	I1217 21:23:01.151811  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:23:01.151877  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:01.156631  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:23:01.156705  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:23:01.186889  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:23:01.186909  588228 cri.go:89] found id: ""
	I1217 21:23:01.186918  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:23:01.186980  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:01.191530  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:23:01.191655  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:23:01.220142  588228 cri.go:89] found id: ""
	I1217 21:23:01.220178  588228 logs.go:282] 0 containers: []
	W1217 21:23:01.220205  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:23:01.220220  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:23:01.220341  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:23:01.259050  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:23:01.259069  588228 cri.go:89] found id: ""
	I1217 21:23:01.259076  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:23:01.259147  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:01.263882  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:23:01.263950  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:23:01.290483  588228 cri.go:89] found id: ""
	I1217 21:23:01.290506  588228 logs.go:282] 0 containers: []
	W1217 21:23:01.290516  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:23:01.290523  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:23:01.290583  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:23:01.321373  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:23:01.321398  588228 cri.go:89] found id: ""
	I1217 21:23:01.321407  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:23:01.321463  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:01.326114  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:23:01.326184  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:23:01.362534  588228 cri.go:89] found id: ""
	I1217 21:23:01.362557  588228 logs.go:282] 0 containers: []
	W1217 21:23:01.362565  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:23:01.362571  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:23:01.362668  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:23:01.398476  588228 cri.go:89] found id: ""
	I1217 21:23:01.398541  588228 logs.go:282] 0 containers: []
	W1217 21:23:01.398566  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:23:01.398592  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:23:01.398619  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:23:01.481262  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:23:01.481352  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:23:01.498826  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:23:01.498889  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:23:01.547322  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:23:01.547398  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:23:01.620211  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:23:01.620363  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:23:01.664516  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:23:01.664601  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:23:01.701765  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:23:01.701846  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:23:01.800756  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:23:01.800774  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:23:01.800786  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:23:01.887521  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:23:01.888360  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:23:04.438310  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:23:04.448632  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:23:04.448705  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:23:04.474908  588228 cri.go:89] found id: "f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:23:04.474931  588228 cri.go:89] found id: ""
	I1217 21:23:04.474939  588228 logs.go:282] 1 containers: [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501]
	I1217 21:23:04.474995  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:04.478680  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:23:04.478748  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:23:04.509320  588228 cri.go:89] found id: "0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:23:04.509347  588228 cri.go:89] found id: ""
	I1217 21:23:04.509362  588228 logs.go:282] 1 containers: [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df]
	I1217 21:23:04.509421  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:04.513403  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:23:04.513476  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:23:04.539371  588228 cri.go:89] found id: ""
	I1217 21:23:04.539394  588228 logs.go:282] 0 containers: []
	W1217 21:23:04.539403  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:23:04.539409  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:23:04.539463  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:23:04.571409  588228 cri.go:89] found id: "f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:23:04.571430  588228 cri.go:89] found id: ""
	I1217 21:23:04.571438  588228 logs.go:282] 1 containers: [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d]
	I1217 21:23:04.571498  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:04.575401  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:23:04.575474  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:23:04.601158  588228 cri.go:89] found id: ""
	I1217 21:23:04.601181  588228 logs.go:282] 0 containers: []
	W1217 21:23:04.601190  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:23:04.601196  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:23:04.601258  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:23:04.627285  588228 cri.go:89] found id: "eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:23:04.627304  588228 cri.go:89] found id: ""
	I1217 21:23:04.627312  588228 logs.go:282] 1 containers: [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd]
	I1217 21:23:04.627368  588228 ssh_runner.go:195] Run: which crictl
	I1217 21:23:04.631180  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:23:04.631301  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:23:04.658168  588228 cri.go:89] found id: ""
	I1217 21:23:04.658191  588228 logs.go:282] 0 containers: []
	W1217 21:23:04.658201  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:23:04.658207  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:23:04.658305  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:23:04.703002  588228 cri.go:89] found id: ""
	I1217 21:23:04.703024  588228 logs.go:282] 0 containers: []
	W1217 21:23:04.703032  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:23:04.703046  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:23:04.703058  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:23:04.839858  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:23:04.839883  588228 logs.go:123] Gathering logs for kube-apiserver [f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501] ...
	I1217 21:23:04.839896  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501"
	I1217 21:23:04.890688  588228 logs.go:123] Gathering logs for kube-controller-manager [eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd] ...
	I1217 21:23:04.890720  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd"
	I1217 21:23:04.949335  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:23:04.949370  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1217 21:23:05.015942  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:23:05.015972  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:23:05.036114  588228 logs.go:123] Gathering logs for etcd [0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df] ...
	I1217 21:23:05.036146  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df"
	I1217 21:23:05.070718  588228 logs.go:123] Gathering logs for kube-scheduler [f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d] ...
	I1217 21:23:05.070751  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d"
	I1217 21:23:05.120222  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:23:05.120315  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:23:05.155037  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:23:05.155138  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:23:07.726025  588228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:23:07.737708  588228 kubeadm.go:602] duration metric: took 4m3.573198734s to restartPrimaryControlPlane
	W1217 21:23:07.737777  588228 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1217 21:23:07.737836  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
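
Editor's note: having timed out on restarting the existing control plane (4m3.5s, logged just above), minikube falls back to wiping the node with `kubeadm reset` and re-running `kubeadm init` from the generated config (the Start: line further below). A hedged Go sketch of that reset-then-init shell-out; the commands are copied from the Run:/Start: lines in this log, and the helper is a stand-in for minikube's ssh_runner, not its real API:

package main

import (
	"fmt"
	"os/exec"
)

// run is a stand-in for minikube's ssh_runner: execute a shell command,
// echo its combined output, and surface the exit error to the caller.
func run(cmd string) error {
	out, err := exec.Command("/bin/bash", "-c", cmd).CombinedOutput()
	fmt.Printf("$ %s\n%s", cmd, out)
	return err
}

func main() {
	_ = run(`sudo kubeadm reset --cri-socket /run/containerd/containerd.sock --force`)
	if err := run(`sudo kubeadm init --config /var/tmp/minikube/kubeadm.yaml`); err != nil {
		fmt.Println("init failed:", err)
	}
}
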
	I1217 21:23:08.244229  588228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 21:23:08.258402  588228 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 21:23:08.267223  588228 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 21:23:08.267282  588228 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 21:23:08.277531  588228 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 21:23:08.277550  588228 kubeadm.go:158] found existing configuration files:
	
	I1217 21:23:08.277603  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1217 21:23:08.285928  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 21:23:08.285992  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 21:23:08.293867  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1217 21:23:08.302510  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 21:23:08.302576  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 21:23:08.310305  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1217 21:23:08.319833  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 21:23:08.319899  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 21:23:08.328760  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1217 21:23:08.339239  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 21:23:08.339304  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
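
Editor's note: the four grep/rm pairs above implement one rule: a kubeconfig under /etc/kubernetes is kept only if it already points at https://control-plane.minikube.internal:8443; otherwise it is removed so kubeadm can regenerate it. A compact Go sketch of the same check, with paths and endpoint taken from the log and error handling simplified:

package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	const endpoint = "https://control-plane.minikube.internal:8443"
	for _, path := range []string{
		"/etc/kubernetes/admin.conf",
		"/etc/kubernetes/kubelet.conf",
		"/etc/kubernetes/controller-manager.conf",
		"/etc/kubernetes/scheduler.conf",
	} {
		data, err := os.ReadFile(path)
		if err != nil || !strings.Contains(string(data), endpoint) {
			// mirrors the `sudo rm -f <path>` lines above
			fmt.Println("removing stale config:", path)
			os.Remove(path)
		}
	}
}

Here none of the four files exist (the earlier `ls -la` exited with status 2), so all four removals are no-ops and kubeadm writes fresh copies during init.
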
	I1217 21:23:08.350560  588228 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 21:23:08.408422  588228 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 21:23:08.409247  588228 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 21:23:08.497471  588228 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 21:23:08.497549  588228 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 21:23:08.497592  588228 kubeadm.go:319] OS: Linux
	I1217 21:23:08.497642  588228 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 21:23:08.497694  588228 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 21:23:08.497746  588228 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 21:23:08.497801  588228 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 21:23:08.497855  588228 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 21:23:08.497907  588228 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 21:23:08.497956  588228 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 21:23:08.498008  588228 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 21:23:08.498057  588228 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 21:23:08.616690  588228 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 21:23:08.616822  588228 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 21:23:08.616919  588228 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 21:23:18.585323  588228 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 21:23:18.588345  588228 out.go:252]   - Generating certificates and keys ...
	I1217 21:23:18.588442  588228 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 21:23:18.588519  588228 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 21:23:18.588602  588228 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 21:23:18.588667  588228 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 21:23:18.588739  588228 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 21:23:18.588797  588228 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 21:23:18.588864  588228 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 21:23:18.588928  588228 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 21:23:18.589181  588228 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 21:23:18.589267  588228 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 21:23:18.589543  588228 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 21:23:18.589609  588228 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 21:23:19.032583  588228 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 21:23:19.598971  588228 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 21:23:20.573088  588228 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 21:23:20.661620  588228 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 21:23:21.003253  588228 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 21:23:21.004433  588228 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 21:23:21.007076  588228 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 21:23:21.010424  588228 out.go:252]   - Booting up control plane ...
	I1217 21:23:21.010534  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 21:23:21.010614  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 21:23:21.010686  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 21:23:21.031881  588228 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 21:23:21.031991  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 21:23:21.041542  588228 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 21:23:21.043433  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 21:23:21.043757  588228 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 21:23:21.182816  588228 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 21:23:21.182980  588228 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 21:27:21.183273  588228 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000517642s
	I1217 21:27:21.183305  588228 kubeadm.go:319] 
	I1217 21:27:21.183363  588228 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 21:27:21.183396  588228 kubeadm.go:319] 	- The kubelet is not running
	I1217 21:27:21.183501  588228 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 21:27:21.183507  588228 kubeadm.go:319] 
	I1217 21:27:21.183611  588228 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 21:27:21.183643  588228 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 21:27:21.183674  588228 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 21:27:21.183678  588228 kubeadm.go:319] 
	I1217 21:27:21.187839  588228 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 21:27:21.188310  588228 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 21:27:21.188426  588228 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 21:27:21.188662  588228 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 21:27:21.188668  588228 kubeadm.go:319] 
	I1217 21:27:21.188737  588228 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
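
Editor's note: the failure itself is kubeadm's kubelet health gate: it polls http://127.0.0.1:10248/healthz for up to 4m0s and aborts when the context deadline expires, which is exactly the error string above. A hedged Go sketch of that style of poll; the 2-second interval is an assumption for illustration and kubeadm's real implementation differs:

package main

import (
	"context"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// 4m0s matches the window kubeadm reports in the log above.
	ctx, cancel := context.WithTimeout(context.Background(), 4*time.Minute)
	defer cancel()
	for {
		req, _ := http.NewRequestWithContext(ctx, http.MethodGet,
			"http://127.0.0.1:10248/healthz", nil)
		resp, err := http.DefaultClient.Do(req)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				fmt.Println("kubelet is healthy")
				return
			}
		}
		if ctx.Err() != nil {
			// corresponds to: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
			fmt.Println("kubelet not healthy before deadline:", ctx.Err())
			return
		}
		time.Sleep(2 * time.Second)
	}
}
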
	W1217 21:27:21.188860  588228 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000517642s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1217 21:27:21.188938  588228 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1217 21:27:21.609492  588228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 21:27:21.623398  588228 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 21:27:21.623461  588228 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 21:27:21.631607  588228 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 21:27:21.631631  588228 kubeadm.go:158] found existing configuration files:
	
	I1217 21:27:21.631684  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1217 21:27:21.639847  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 21:27:21.639906  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 21:27:21.647397  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1217 21:27:21.656619  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 21:27:21.656689  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 21:27:21.664671  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1217 21:27:21.672587  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 21:27:21.672650  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 21:27:21.680550  588228 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1217 21:27:21.688385  588228 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 21:27:21.688456  588228 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 21:27:21.695625  588228 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 21:27:21.821883  588228 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 21:27:21.822322  588228 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1217 21:27:21.899322  588228 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 21:31:23.155243  588228 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 21:31:23.155281  588228 kubeadm.go:319] 
	I1217 21:31:23.155353  588228 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
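
Editor's note: the second init attempt fails identically after another 4m0s wait on the same healthz endpoint, and further down in this cycle minikube records the total cost: "duration metric: took 12m19.05543739s to StartCluster". Those duration lines are plain wall-clock measurements; a minimal sketch of the pattern, where the sleep stands in for the real work:

package main

import (
	"fmt"
	"time"
)

func startCluster() {
	start := time.Now()
	defer func() {
		fmt.Printf("duration metric: took %s to StartCluster\n", time.Since(start))
	}()
	time.Sleep(1500 * time.Millisecond) // stand-in for the real work
}

func main() { startCluster() }
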
	I1217 21:31:23.156267  588228 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 21:31:23.156329  588228 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 21:31:23.156453  588228 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 21:31:23.156522  588228 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 21:31:23.156557  588228 kubeadm.go:319] OS: Linux
	I1217 21:31:23.156603  588228 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 21:31:23.156652  588228 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 21:31:23.156699  588228 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 21:31:23.156747  588228 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 21:31:23.156795  588228 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 21:31:23.156844  588228 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 21:31:23.156889  588228 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 21:31:23.156937  588228 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 21:31:23.156983  588228 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 21:31:23.157055  588228 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 21:31:23.157150  588228 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 21:31:23.157239  588228 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 21:31:23.157301  588228 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 21:31:23.160332  588228 out.go:252]   - Generating certificates and keys ...
	I1217 21:31:23.160435  588228 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 21:31:23.160513  588228 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 21:31:23.160596  588228 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 21:31:23.160663  588228 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 21:31:23.160763  588228 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 21:31:23.160838  588228 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 21:31:23.160911  588228 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 21:31:23.160983  588228 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 21:31:23.161063  588228 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 21:31:23.161157  588228 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 21:31:23.161206  588228 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 21:31:23.161274  588228 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 21:31:23.161329  588228 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 21:31:23.161392  588228 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 21:31:23.161449  588228 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 21:31:23.161516  588228 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 21:31:23.161574  588228 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 21:31:23.161662  588228 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 21:31:23.161730  588228 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 21:31:23.164765  588228 out.go:252]   - Booting up control plane ...
	I1217 21:31:23.164890  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 21:31:23.164975  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 21:31:23.165047  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 21:31:23.165154  588228 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 21:31:23.165254  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 21:31:23.165361  588228 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 21:31:23.165447  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 21:31:23.165499  588228 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 21:31:23.165630  588228 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 21:31:23.165739  588228 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 21:31:23.165807  588228 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000869511s
	I1217 21:31:23.165815  588228 kubeadm.go:319] 
	I1217 21:31:23.165872  588228 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 21:31:23.165908  588228 kubeadm.go:319] 	- The kubelet is not running
	I1217 21:31:23.166015  588228 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 21:31:23.166022  588228 kubeadm.go:319] 
	I1217 21:31:23.166126  588228 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 21:31:23.166161  588228 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 21:31:23.166194  588228 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 21:31:23.166264  588228 kubeadm.go:403] duration metric: took 12m19.05543739s to StartCluster
	I1217 21:31:23.166300  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:31:23.166368  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:31:23.166466  588228 kubeadm.go:319] 
	I1217 21:31:23.193416  588228 cri.go:89] found id: ""
	I1217 21:31:23.193440  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.193453  588228 logs.go:284] No container was found matching "kube-apiserver"
	I1217 21:31:23.193460  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:31:23.193520  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:31:23.223314  588228 cri.go:89] found id: ""
	I1217 21:31:23.223349  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.223358  588228 logs.go:284] No container was found matching "etcd"
	I1217 21:31:23.223365  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:31:23.223430  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:31:23.252042  588228 cri.go:89] found id: ""
	I1217 21:31:23.252070  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.252079  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:31:23.252085  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:31:23.252147  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:31:23.281849  588228 cri.go:89] found id: ""
	I1217 21:31:23.281877  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.281891  588228 logs.go:284] No container was found matching "kube-scheduler"
	I1217 21:31:23.281898  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:31:23.281958  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:31:23.312201  588228 cri.go:89] found id: ""
	I1217 21:31:23.312312  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.312336  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:31:23.312354  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:31:23.312444  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:31:23.347018  588228 cri.go:89] found id: ""
	I1217 21:31:23.347042  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.347050  588228 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 21:31:23.347057  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:31:23.347113  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:31:23.382404  588228 cri.go:89] found id: ""
	I1217 21:31:23.382427  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.382435  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:31:23.382441  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:31:23.382498  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:31:23.415326  588228 cri.go:89] found id: ""
	I1217 21:31:23.415349  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.415357  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:31:23.415367  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:31:23.415378  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:31:23.478808  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:31:23.478848  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:31:23.494510  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:31:23.494539  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:31:23.569772  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:31:23.569797  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:31:23.569810  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:31:23.614990  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:31:23.615033  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 21:31:23.651955  588228 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000869511s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1217 21:31:23.652001  588228 out.go:285] * 
	W1217 21:31:23.652076  588228 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000869511s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 21:31:23.652096  588228 out.go:285] * 
	W1217 21:31:23.654399  588228 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 21:31:23.659480  588228 out.go:203] 
	W1217 21:31:23.663155  588228 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000869511s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1217 21:31:23.663260  588228 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 21:31:23.663322  588228 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 21:31:23.666523  588228 out.go:203] 

                                                
                                                
** /stderr **
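The stderr above carries the two actionable hints from this failure: the cgroups v1 deprecation warning (kubelet v1.35 or newer fails on cgroup v1 hosts unless the kubelet configuration option FailCgroupV1 is set to false, per the linked KEP) and minikube's own suggestion to force the systemd cgroup driver. A minimal retry sketch built only from those hints; the profile name and flags mirror this failed run, and it is not validated against the 10248/healthz timeout seen here:

	# Hypothetical retry per the suggestion printed in this log; not validated here.
	out/minikube-linux-arm64 delete -p kubernetes-upgrade-332113
	out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 \
	  --kubernetes-version=v1.35.0-rc.1 --driver=docker --container-runtime=containerd \
	  --extra-config=kubelet.cgroup-driver=systemd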
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-332113 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-332113 version --output=json: exit status 1 (93.318172ms)

                                                
                                                
-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
version_upgrade_test.go:250: error running kubectl: exit status 1
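The connection-refused error is consistent with the apiserver never having started: the cri.go checks earlier in this log found no kube-apiserver container. For manual triage, the same checks minikube ran over SSH can be reproduced against the node container directly; a sketch, assuming the container name from the docker inspect output below and the root shell the kicbase image provides in this run:

	# Hypothetical manual follow-up; mirrors the log-gathering commands earlier in this log.
	docker exec kubernetes-upgrade-332113 systemctl status kubelet --no-pager
	docker exec kubernetes-upgrade-332113 journalctl -xeu kubelet -n 50 --no-pager
	docker exec kubernetes-upgrade-332113 crictl ps -a   # expect no kube-apiserver container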
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-12-17 21:31:24.361044014 +0000 UTC m=+5045.702549890
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect kubernetes-upgrade-332113
helpers_test.go:244: (dbg) docker inspect kubernetes-upgrade-332113:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7",
	        "Created": "2025-12-17T21:18:20.835151295Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 588355,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-17T21:18:49.781203605Z",
	            "FinishedAt": "2025-12-17T21:18:48.849465551Z"
	        },
	        "Image": "sha256:2a6398fc76fc21dc0a77ac54600c2604c101bff52e66ecf65f88ec0f1a8cff2d",
	        "ResolvConfPath": "/var/lib/docker/containers/a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7/hostname",
	        "HostsPath": "/var/lib/docker/containers/a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7/hosts",
	        "LogPath": "/var/lib/docker/containers/a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7/a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7-json.log",
	        "Name": "/kubernetes-upgrade-332113",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "kubernetes-upgrade-332113:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-332113",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "a9d85e15cb635ac094d60a029e07ccbf4b549c41dd3c9b3f65bec64a261884a7",
	                "LowerDir": "/var/lib/docker/overlay2/a2dac7d1c2a56cb9762796962b5e5981178918ac4cf2a27fd73862afc5ab1460-init/diff:/var/lib/docker/overlay2/83c8e6311894730d80a5439b5d4991744e9cfa6d0015df9caca346d57baf92e8/diff",
	                "MergedDir": "/var/lib/docker/overlay2/a2dac7d1c2a56cb9762796962b5e5981178918ac4cf2a27fd73862afc5ab1460/merged",
	                "UpperDir": "/var/lib/docker/overlay2/a2dac7d1c2a56cb9762796962b5e5981178918ac4cf2a27fd73862afc5ab1460/diff",
	                "WorkDir": "/var/lib/docker/overlay2/a2dac7d1c2a56cb9762796962b5e5981178918ac4cf2a27fd73862afc5ab1460/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-332113",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-332113/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-332113",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-332113",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-332113",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "ecf9c26e6b3b229384534c68d1cf3af6923b8c551d321ad5f6ad4059027747a0",
	            "SandboxKey": "/var/run/docker/netns/ecf9c26e6b3b",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33413"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33414"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33417"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33415"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33416"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-332113": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "fa:d7:2a:06:b4:d8",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "acaf092fb27d6358ba37c833c503578896a72ef9b302b0df32fdbee99a5f04c5",
	                    "EndpointID": "5e1cfd36f00f6a67134b04a3ce5e27a6478ec123cc400ac6442dcfc0f493c296",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-332113",
	                        "a9d85e15cb63"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-332113 -n kubernetes-upgrade-332113
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-332113 -n kubernetes-upgrade-332113: exit status 2 (355.549467ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-332113 logs -n 25
helpers_test.go:261: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                   ARGS                                                                                   │        PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ -p calico-675779 sudo systemctl status kubelet --all --full --no-pager                                                                                                   │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo systemctl cat kubelet --no-pager                                                                                                                   │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo journalctl -xeu kubelet --all --full --no-pager                                                                                                    │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /etc/kubernetes/kubelet.conf                                                                                                                   │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /var/lib/kubelet/config.yaml                                                                                                                   │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo systemctl status docker --all --full --no-pager                                                                                                    │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo systemctl cat docker --no-pager                                                                                                                    │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /etc/docker/daemon.json                                                                                                                        │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo docker system info                                                                                                                                 │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo systemctl status cri-docker --all --full --no-pager                                                                                                │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo systemctl cat cri-docker --no-pager                                                                                                                │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /etc/systemd/system/cri-docker.service.d/10-cni.conf                                                                                           │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo cat /usr/lib/systemd/system/cri-docker.service                                                                                                     │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cri-dockerd --version                                                                                                                              │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo systemctl status containerd --all --full --no-pager                                                                                                │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo systemctl cat containerd --no-pager                                                                                                                │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /lib/systemd/system/containerd.service                                                                                                         │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo cat /etc/containerd/config.toml                                                                                                                    │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo containerd config dump                                                                                                                             │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo systemctl status crio --all --full --no-pager                                                                                                      │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	│ ssh     │ -p calico-675779 sudo systemctl cat crio --no-pager                                                                                                                      │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo find /etc/crio -type f -exec sh -c 'echo {}; cat {}' \;                                                                                            │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ ssh     │ -p calico-675779 sudo crio config                                                                                                                                        │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ delete  │ -p calico-675779                                                                                                                                                         │ calico-675779         │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │ 17 Dec 25 21:30 UTC │
	│ start   │ -p custom-flannel-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=docker  --container-runtime=containerd │ custom-flannel-675779 │ jenkins │ v1.37.0 │ 17 Dec 25 21:30 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 21:30:39
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 21:30:39.097363  638545 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:30:39.097516  638545 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:30:39.097528  638545 out.go:374] Setting ErrFile to fd 2...
	I1217 21:30:39.097549  638545 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:30:39.097931  638545 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:30:39.098519  638545 out.go:368] Setting JSON to false
	I1217 21:30:39.099474  638545 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":15184,"bootTime":1765991855,"procs":182,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 21:30:39.099576  638545 start.go:143] virtualization:  
	I1217 21:30:39.104497  638545 out.go:179] * [custom-flannel-675779] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 21:30:39.109031  638545 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 21:30:39.109144  638545 notify.go:221] Checking for updates...
	I1217 21:30:39.115851  638545 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 21:30:39.119248  638545 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 21:30:39.122478  638545 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 21:30:39.125739  638545 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 21:30:39.128857  638545 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 21:30:39.132577  638545 config.go:182] Loaded profile config "kubernetes-upgrade-332113": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 21:30:39.132688  638545 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 21:30:39.168635  638545 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 21:30:39.168832  638545 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:30:39.224632  638545 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 21:30:39.215169011 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:30:39.224738  638545 docker.go:319] overlay module found
	I1217 21:30:39.228104  638545 out.go:179] * Using the docker driver based on user configuration
	I1217 21:30:39.231081  638545 start.go:309] selected driver: docker
	I1217 21:30:39.231107  638545 start.go:927] validating driver "docker" against <nil>
	I1217 21:30:39.231123  638545 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 21:30:39.231866  638545 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:30:39.289120  638545 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 21:30:39.279941389 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:30:39.289277  638545 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1217 21:30:39.289522  638545 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1217 21:30:39.292685  638545 out.go:179] * Using Docker driver with root privileges
	I1217 21:30:39.295519  638545 cni.go:84] Creating CNI manager for "testdata/kube-flannel.yaml"
	I1217 21:30:39.295555  638545 start_flags.go:336] Found "testdata/kube-flannel.yaml" CNI - setting NetworkPlugin=cni
	I1217 21:30:39.295628  638545 start.go:353] cluster config:
	{Name:custom-flannel-675779 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:custom-flannel-675779 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:testdata/kube-flannel.yaml} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 21:30:39.298842  638545 out.go:179] * Starting "custom-flannel-675779" primary control-plane node in "custom-flannel-675779" cluster
	I1217 21:30:39.301684  638545 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 21:30:39.304680  638545 out.go:179] * Pulling base image v0.0.48-1765661130-22141 ...
	I1217 21:30:39.307631  638545 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1217 21:30:39.307688  638545 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
	I1217 21:30:39.307698  638545 cache.go:65] Caching tarball of preloaded images
	I1217 21:30:39.307734  638545 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 21:30:39.307785  638545 preload.go:238] Found /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1217 21:30:39.307801  638545 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on containerd
	I1217 21:30:39.307915  638545 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/config.json ...
	I1217 21:30:39.307932  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/config.json: {Name:mk4546de7c5367c9c2e14af4b2ab180fba22676b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:39.327699  638545 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon, skipping pull
	I1217 21:30:39.327720  638545 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in daemon, skipping load
	I1217 21:30:39.327740  638545 cache.go:243] Successfully downloaded all kic artifacts
	I1217 21:30:39.327773  638545 start.go:360] acquireMachinesLock for custom-flannel-675779: {Name:mk37f8e32c67cac1796d871719ac5834a40c33b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1217 21:30:39.327888  638545 start.go:364] duration metric: took 93.433µs to acquireMachinesLock for "custom-flannel-675779"
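
Both lock flavors visible above (the profile-config WriteFile lock with Delay:500ms Timeout:1m0s, and acquireMachinesLock with a 10m timeout) follow the same retry-until-deadline pattern. A toy Go sketch of that pattern, using a plain O_EXCL lock file as a stand-in for whatever backend minikube's lock.go actually uses (the path /tmp/machines.lock is ours, purely illustrative):

    package main

    import (
        "fmt"
        "os"
        "time"
    )

    // acquire retries creating lockPath exclusively every delay until timeout,
    // returning a release func on success. A crash can leave a stale file
    // behind, which is one reason real implementations use more than O_EXCL.
    func acquire(lockPath string, delay, timeout time.Duration) (func(), error) {
        deadline := time.Now().Add(timeout)
        for {
            f, err := os.OpenFile(lockPath, os.O_CREATE|os.O_EXCL|os.O_WRONLY, 0o644)
            if err == nil {
                f.Close()
                return func() { os.Remove(lockPath) }, nil
            }
            if time.Now().After(deadline) {
                return nil, fmt.Errorf("timed out waiting for %s", lockPath)
            }
            time.Sleep(delay)
        }
    }

    func main() {
        release, err := acquire("/tmp/machines.lock", 500*time.Millisecond, time.Minute)
        if err != nil {
            panic(err)
        }
        defer release()
        fmt.Println("lock held; provisioning would happen here")
    }
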
	I1217 21:30:39.327921  638545 start.go:93] Provisioning new machine with config: &{Name:custom-flannel-675779 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:custom-flannel-675779 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:testdata/kube-flannel.yaml} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 21:30:39.328002  638545 start.go:125] createHost starting for "" (driver="docker")
	I1217 21:30:39.331454  638545 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1217 21:30:39.331714  638545 start.go:159] libmachine.API.Create for "custom-flannel-675779" (driver="docker")
	I1217 21:30:39.331750  638545 client.go:173] LocalClient.Create starting
	I1217 21:30:39.331828  638545 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem
	I1217 21:30:39.331871  638545 main.go:143] libmachine: Decoding PEM data...
	I1217 21:30:39.331887  638545 main.go:143] libmachine: Parsing certificate...
	I1217 21:30:39.331940  638545 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem
	I1217 21:30:39.331958  638545 main.go:143] libmachine: Decoding PEM data...
	I1217 21:30:39.331969  638545 main.go:143] libmachine: Parsing certificate...
	I1217 21:30:39.332356  638545 cli_runner.go:164] Run: docker network inspect custom-flannel-675779 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1217 21:30:39.348056  638545 cli_runner.go:211] docker network inspect custom-flannel-675779 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1217 21:30:39.348153  638545 network_create.go:284] running [docker network inspect custom-flannel-675779] to gather additional debugging logs...
	I1217 21:30:39.348174  638545 cli_runner.go:164] Run: docker network inspect custom-flannel-675779
	W1217 21:30:39.364040  638545 cli_runner.go:211] docker network inspect custom-flannel-675779 returned with exit code 1
	I1217 21:30:39.364071  638545 network_create.go:287] error running [docker network inspect custom-flannel-675779]: docker network inspect custom-flannel-675779: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network custom-flannel-675779 not found
	I1217 21:30:39.364084  638545 network_create.go:289] output of [docker network inspect custom-flannel-675779]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network custom-flannel-675779 not found
	
	** /stderr **
	I1217 21:30:39.364202  638545 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 21:30:39.389450  638545 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-3e64c97094b7 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:1a:a4:55:13:27:1d} reservation:<nil>}
	I1217 21:30:39.389839  638545 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-bc99df562746 IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:1e:36:a1:cd:0c:dc} reservation:<nil>}
	I1217 21:30:39.390086  638545 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-66979cc3842e IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:d2:e7:b7:c2:a1:6f} reservation:<nil>}
	I1217 21:30:39.390355  638545 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-acaf092fb27d IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:62:fe:67:13:9e:68} reservation:<nil>}
	I1217 21:30:39.390820  638545 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40019b9790}
	I1217 21:30:39.390847  638545 network_create.go:124] attempt to create docker network custom-flannel-675779 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1217 21:30:39.390913  638545 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=custom-flannel-675779 custom-flannel-675779
	I1217 21:30:39.447108  638545 network_create.go:108] docker network custom-flannel-675779 192.168.85.0/24 created
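
The subnet scan above is the interesting part: minikube walks candidate private /24 blocks (192.168.49.0, 192.168.58.0, 192.168.67.0, 192.168.76.0, ...), skips any whose gateway address is already bound to a host bridge interface, and settles on the first free one (192.168.85.0/24 here). A minimal Go sketch of that probe, assuming the step-by-9 candidate list observed in this log and using only the standard library; this is an illustration, not minikube's actual network.go:

    package main

    import (
        "fmt"
        "net"
    )

    // gatewayTaken reports whether any host interface already owns ip,
    // which is how a candidate subnet reveals an existing docker bridge.
    func gatewayTaken(ip net.IP) bool {
        addrs, err := net.InterfaceAddrs()
        if err != nil {
            return true // be conservative on error
        }
        for _, a := range addrs {
            if ipn, ok := a.(*net.IPNet); ok && ipn.IP.Equal(ip) {
                return true
            }
        }
        return false
    }

    func main() {
        // Candidate third octets observed in the log: 49, 58, 67, 76, 85, ...
        for octet := 49; octet < 255; octet += 9 {
            gw := net.IPv4(192, 168, byte(octet), 1)
            if gatewayTaken(gw) {
                fmt.Printf("skipping 192.168.%d.0/24: gateway %s in use\n", octet, gw)
                continue
            }
            fmt.Printf("using free subnet 192.168.%d.0/24 (gateway %s)\n", octet, gw)
            return
        }
    }
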
	I1217 21:30:39.447142  638545 kic.go:121] calculated static IP "192.168.85.2" for the "custom-flannel-675779" container
	I1217 21:30:39.447215  638545 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1217 21:30:39.463518  638545 cli_runner.go:164] Run: docker volume create custom-flannel-675779 --label name.minikube.sigs.k8s.io=custom-flannel-675779 --label created_by.minikube.sigs.k8s.io=true
	I1217 21:30:39.481232  638545 oci.go:103] Successfully created a docker volume custom-flannel-675779
	I1217 21:30:39.481338  638545 cli_runner.go:164] Run: docker run --rm --name custom-flannel-675779-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=custom-flannel-675779 --entrypoint /usr/bin/test -v custom-flannel-675779:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -d /var/lib
	I1217 21:30:40.076388  638545 oci.go:107] Successfully prepared a docker volume custom-flannel-675779
	I1217 21:30:40.076460  638545 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1217 21:30:40.076485  638545 kic.go:194] Starting extracting preloaded images to volume ...
	I1217 21:30:40.076571  638545 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v custom-flannel-675779:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -I lz4 -xf /preloaded.tar -C /extractDir
	I1217 21:30:44.963544  638545 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v custom-flannel-675779:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 -I lz4 -xf /preloaded.tar -C /extractDir: (4.886932386s)
	I1217 21:30:44.963581  638545 kic.go:203] duration metric: took 4.887102619s to extract preloaded images to volume ...
	W1217 21:30:44.963738  638545 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1217 21:30:44.963849  638545 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1217 21:30:45.076985  638545 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname custom-flannel-675779 --name custom-flannel-675779 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=custom-flannel-675779 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=custom-flannel-675779 --network custom-flannel-675779 --ip 192.168.85.2 --volume custom-flannel-675779:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78
	I1217 21:30:45.482735  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Running}}
	I1217 21:30:45.507090  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:30:45.538030  638545 cli_runner.go:164] Run: docker exec custom-flannel-675779 stat /var/lib/dpkg/alternatives/iptables
	I1217 21:30:45.601697  638545 oci.go:144] the created container "custom-flannel-675779" has a running status.
	I1217 21:30:45.601731  638545 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa...
	I1217 21:30:45.881976  638545 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1217 21:30:45.906294  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:30:45.932074  638545 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1217 21:30:45.932095  638545 kic_runner.go:114] Args: [docker exec --privileged custom-flannel-675779 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1217 21:30:46.000700  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:30:46.040816  638545 machine.go:94] provisionDockerMachine start ...
	I1217 21:30:46.040909  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:46.068573  638545 main.go:143] libmachine: Using SSH client type: native
	I1217 21:30:46.068933  638545 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33448 <nil> <nil>}
	I1217 21:30:46.068944  638545 main.go:143] libmachine: About to run SSH command:
	hostname
	I1217 21:30:46.069704  638545 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1217 21:30:49.204076  638545 main.go:143] libmachine: SSH cmd err, output: <nil>: custom-flannel-675779
	
	I1217 21:30:49.204099  638545 ubuntu.go:182] provisioning hostname "custom-flannel-675779"
	I1217 21:30:49.204162  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:49.222701  638545 main.go:143] libmachine: Using SSH client type: native
	I1217 21:30:49.223022  638545 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33448 <nil> <nil>}
	I1217 21:30:49.223034  638545 main.go:143] libmachine: About to run SSH command:
	sudo hostname custom-flannel-675779 && echo "custom-flannel-675779" | sudo tee /etc/hostname
	I1217 21:30:49.383031  638545 main.go:143] libmachine: SSH cmd err, output: <nil>: custom-flannel-675779
	
	I1217 21:30:49.383134  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:49.402865  638545 main.go:143] libmachine: Using SSH client type: native
	I1217 21:30:49.403183  638545 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33448 <nil> <nil>}
	I1217 21:30:49.403205  638545 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\scustom-flannel-675779' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 custom-flannel-675779/g' /etc/hosts;
				else 
					echo '127.0.1.1 custom-flannel-675779' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1217 21:30:49.536421  638545 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1217 21:30:49.536450  638545 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21808-367595/.minikube CaCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21808-367595/.minikube}
	I1217 21:30:49.536479  638545 ubuntu.go:190] setting up certificates
	I1217 21:30:49.536488  638545 provision.go:84] configureAuth start
	I1217 21:30:49.536557  638545 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" custom-flannel-675779
	I1217 21:30:49.553605  638545 provision.go:143] copyHostCerts
	I1217 21:30:49.553671  638545 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem, removing ...
	I1217 21:30:49.553689  638545 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem
	I1217 21:30:49.553767  638545 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/ca.pem (1082 bytes)
	I1217 21:30:49.553863  638545 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem, removing ...
	I1217 21:30:49.553873  638545 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem
	I1217 21:30:49.553900  638545 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/cert.pem (1123 bytes)
	I1217 21:30:49.553956  638545 exec_runner.go:144] found /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem, removing ...
	I1217 21:30:49.553964  638545 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem
	I1217 21:30:49.553988  638545 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21808-367595/.minikube/key.pem (1679 bytes)
	I1217 21:30:49.554040  638545 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem org=jenkins.custom-flannel-675779 san=[127.0.0.1 192.168.85.2 custom-flannel-675779 localhost minikube]
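
configureAuth issues a server certificate whose SANs cover every name the machine may be dialed by: 127.0.0.1, the container IP 192.168.85.2, the profile name, localhost, and minikube. A self-contained sketch of producing such a SAN certificate with Go's crypto/x509; it self-signs for brevity, whereas minikube signs with the ca.pem/ca-key.pem pair read above:

    package main

    import (
        "crypto/ecdsa"
        "crypto/elliptic"
        "crypto/rand"
        "crypto/x509"
        "crypto/x509/pkix"
        "encoding/pem"
        "math/big"
        "net"
        "os"
        "time"
    )

    func main() {
        key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }
        tmpl := &x509.Certificate{
            SerialNumber: big.NewInt(1),
            Subject:      pkix.Name{Organization: []string{"jenkins.custom-flannel-675779"}},
            NotBefore:    time.Now(),
            NotAfter:     time.Now().Add(26280 * time.Hour), // matches CertExpiration in the config dump
            KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
            ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
            // SANs copied from the provision log line above:
            IPAddresses: []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.85.2")},
            DNSNames:    []string{"custom-flannel-675779", "localhost", "minikube"},
        }
        // Self-signed here (template doubles as parent); minikube uses its CA instead.
        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
        if err != nil {
            panic(err)
        }
        pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: der})
    }
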
	I1217 21:30:49.823490  638545 provision.go:177] copyRemoteCerts
	I1217 21:30:49.823574  638545 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1217 21:30:49.823620  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:49.841145  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:30:49.937486  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1217 21:30:49.955699  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1217 21:30:49.974783  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/machines/server.pem --> /etc/docker/server.pem (1233 bytes)
	I1217 21:30:49.996319  638545 provision.go:87] duration metric: took 459.813478ms to configureAuth
	I1217 21:30:49.996357  638545 ubuntu.go:206] setting minikube options for container-runtime
	I1217 21:30:49.996586  638545 config.go:182] Loaded profile config "custom-flannel-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:30:49.996601  638545 machine.go:97] duration metric: took 3.955766455s to provisionDockerMachine
	I1217 21:30:49.996610  638545 client.go:176] duration metric: took 10.664852891s to LocalClient.Create
	I1217 21:30:49.996624  638545 start.go:167] duration metric: took 10.664912018s to libmachine.API.Create "custom-flannel-675779"
	I1217 21:30:49.996636  638545 start.go:293] postStartSetup for "custom-flannel-675779" (driver="docker")
	I1217 21:30:49.996658  638545 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1217 21:30:49.996737  638545 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1217 21:30:49.996790  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:50.016562  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:30:50.125246  638545 ssh_runner.go:195] Run: cat /etc/os-release
	I1217 21:30:50.128752  638545 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1217 21:30:50.128783  638545 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1217 21:30:50.128795  638545 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/addons for local assets ...
	I1217 21:30:50.128853  638545 filesync.go:126] Scanning /home/jenkins/minikube-integration/21808-367595/.minikube/files for local assets ...
	I1217 21:30:50.128941  638545 filesync.go:149] local asset: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem -> 3694612.pem in /etc/ssl/certs
	I1217 21:30:50.129049  638545 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1217 21:30:50.136922  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:30:50.156137  638545 start.go:296] duration metric: took 159.473428ms for postStartSetup
	I1217 21:30:50.156613  638545 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" custom-flannel-675779
	I1217 21:30:50.174468  638545 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/config.json ...
	I1217 21:30:50.174764  638545 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 21:30:50.174825  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:50.193113  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:30:50.286171  638545 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1217 21:30:50.291263  638545 start.go:128] duration metric: took 10.963245645s to createHost
	I1217 21:30:50.291287  638545 start.go:83] releasing machines lock for "custom-flannel-675779", held for 10.963384125s
	I1217 21:30:50.291382  638545 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" custom-flannel-675779
	I1217 21:30:50.308762  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 21:30:50.308821  638545 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 21:30:50.308831  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 21:30:50.308866  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 21:30:50.308896  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 21:30:50.308931  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 21:30:50.308985  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:30:50.309051  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 21:30:50.309111  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:30:50.331165  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:30:50.439336  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 21:30:50.458026  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 21:30:50.476122  638545 ssh_runner.go:195] Run: openssl version
	I1217 21:30:50.484694  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 21:30:50.492875  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 21:30:50.500548  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 21:30:50.504587  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 21:30:50.504684  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 21:30:50.548352  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 21:30:50.555855  638545 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/369461.pem /etc/ssl/certs/51391683.0
	I1217 21:30:50.563177  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 21:30:50.570672  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 21:30:50.579205  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 21:30:50.583048  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 21:30:50.583137  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 21:30:50.629009  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1217 21:30:50.636605  638545 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/3694612.pem /etc/ssl/certs/3ec20f2e.0
	I1217 21:30:50.643984  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:50.651317  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 21:30:50.658762  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:50.662656  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:50.662763  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:50.704141  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 21:30:50.711696  638545 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1217 21:30:50.719134  638545 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-certificates >/dev/null 2>&1 && sudo update-ca-certificates || true"
	I1217 21:30:50.722780  638545 ssh_runner.go:195] Run: /bin/sh -c "command -v update-ca-trust >/dev/null 2>&1 && sudo update-ca-trust extract || true"
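
The ln -fs dance above follows OpenSSL's subject-hash convention: each CA in /etc/ssl/certs is addressed by a symlink named <subject-hash>.0, so verification can locate it by hash alone (51391683.0, 3ec20f2e.0 and b5213941.0 in this run). A rough Go sketch of wiring one such link, shelling out to openssl the same way the ssh_runner lines do remotely; the local paths here are illustrative:

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "strings"
    )

    // subjectHash returns OpenSSL's subject hash for a PEM certificate,
    // e.g. "b5213941" for minikubeCA.pem in the log above.
    func subjectHash(pemPath string) (string, error) {
        out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pemPath).Output()
        if err != nil {
            return "", err
        }
        return strings.TrimSpace(string(out)), nil
    }

    func main() {
        cert := "/usr/share/ca-certificates/minikubeCA.pem"
        h, err := subjectHash(cert)
        if err != nil {
            panic(err)
        }
        link := fmt.Sprintf("/etc/ssl/certs/%s.0", h)
        os.Remove(link) // mirror ln -fs: replace any stale link
        if err := os.Symlink(cert, link); err != nil {
            panic(err)
        }
        fmt.Println("linked", link, "->", cert)
    }
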
	I1217 21:30:50.726407  638545 ssh_runner.go:195] Run: cat /version.json
	I1217 21:30:50.726515  638545 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1217 21:30:50.836489  638545 ssh_runner.go:195] Run: systemctl --version
	I1217 21:30:50.843078  638545 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1217 21:30:50.847298  638545 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1217 21:30:50.847393  638545 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1217 21:30:50.877064  638545 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1217 21:30:50.877131  638545 start.go:496] detecting cgroup driver to use...
	I1217 21:30:50.877177  638545 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1217 21:30:50.877253  638545 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1217 21:30:50.892111  638545 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1217 21:30:50.905403  638545 docker.go:218] disabling cri-docker service (if available) ...
	I1217 21:30:50.905477  638545 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1217 21:30:50.923035  638545 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1217 21:30:50.941614  638545 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1217 21:30:51.058047  638545 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1217 21:30:51.180259  638545 docker.go:234] disabling docker service ...
	I1217 21:30:51.180350  638545 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1217 21:30:51.202938  638545 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1217 21:30:51.216845  638545 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1217 21:30:51.340116  638545 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1217 21:30:51.460831  638545 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1217 21:30:51.473749  638545 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1217 21:30:51.489418  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1217 21:30:51.500978  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1217 21:30:51.510831  638545 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1217 21:30:51.510980  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1217 21:30:51.534990  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 21:30:51.548532  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1217 21:30:51.557404  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1217 21:30:51.568895  638545 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1217 21:30:51.577406  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1217 21:30:51.587242  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1217 21:30:51.596386  638545 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1217 21:30:51.605882  638545 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1217 21:30:51.614396  638545 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1217 21:30:51.621974  638545 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 21:30:51.734602  638545 ssh_runner.go:195] Run: sudo systemctl restart containerd
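
All of the sed -i runs above are idempotent, indentation-preserving regex patches against /etc/containerd/config.toml: pin sandbox_image to registry.k8s.io/pause:3.10.1, force SystemdCgroup = false to match the cgroupfs driver detected on the host, map legacy runtime names to io.containerd.runc.v2, and point conf_dir at /etc/cni/net.d, followed by daemon-reload and a containerd restart. The same style of patching done natively in Go rather than via sed, as a rough sketch covering three of the edits:

    package main

    import (
        "os"
        "regexp"
    )

    // patch applies a whole-line regex rewrite, mirroring one `sed -i -r` call;
    // ${1} re-inserts the captured leading indentation.
    func patch(conf []byte, expr, repl string) []byte {
        return regexp.MustCompile(expr).ReplaceAll(conf, []byte(repl))
    }

    func main() {
        const path = "/etc/containerd/config.toml"
        conf, err := os.ReadFile(path)
        if err != nil {
            panic(err)
        }
        conf = patch(conf, `(?m)^( *)sandbox_image = .*$`, `${1}sandbox_image = "registry.k8s.io/pause:3.10.1"`)
        conf = patch(conf, `(?m)^( *)SystemdCgroup = .*$`, `${1}SystemdCgroup = false`)
        conf = patch(conf, `(?m)^( *)conf_dir = .*$`, `${1}conf_dir = "/etc/cni/net.d"`)
        if err := os.WriteFile(path, conf, 0o644); err != nil {
            panic(err)
        }
        // A `systemctl restart containerd` is still needed for the changes to apply.
    }
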
	I1217 21:30:51.868677  638545 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1217 21:30:51.868753  638545 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1217 21:30:51.872722  638545 start.go:564] Will wait 60s for crictl version
	I1217 21:30:51.872787  638545 ssh_runner.go:195] Run: which crictl
	I1217 21:30:51.876419  638545 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1217 21:30:51.905291  638545 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1217 21:30:51.905362  638545 ssh_runner.go:195] Run: containerd --version
	I1217 21:30:51.928790  638545 ssh_runner.go:195] Run: containerd --version
	I1217 21:30:51.957172  638545 out.go:179] * Preparing Kubernetes v1.34.3 on containerd 2.2.0 ...
	I1217 21:30:51.960239  638545 cli_runner.go:164] Run: docker network inspect custom-flannel-675779 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1217 21:30:51.976480  638545 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1217 21:30:51.980481  638545 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.85.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1217 21:30:51.990392  638545 kubeadm.go:884] updating cluster {Name:custom-flannel-675779 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:custom-flannel-675779 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:testdata/kube-flannel.yaml} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1217 21:30:51.990510  638545 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1217 21:30:51.990584  638545 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 21:30:52.022309  638545 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 21:30:52.022336  638545 containerd.go:534] Images already preloaded, skipping extraction
	I1217 21:30:52.022402  638545 ssh_runner.go:195] Run: sudo crictl images --output json
	I1217 21:30:52.050722  638545 containerd.go:627] all images are preloaded for containerd runtime.
	I1217 21:30:52.050747  638545 cache_images.go:86] Images are preloaded, skipping loading
	I1217 21:30:52.050754  638545 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.3 containerd true true} ...
	I1217 21:30:52.050894  638545 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=custom-flannel-675779 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:custom-flannel-675779 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:testdata/kube-flannel.yaml}
	I1217 21:30:52.050965  638545 ssh_runner.go:195] Run: sudo crictl info
	I1217 21:30:52.077975  638545 cni.go:84] Creating CNI manager for "testdata/kube-flannel.yaml"
	I1217 21:30:52.078017  638545 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1217 21:30:52.078042  638545 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:custom-flannel-675779 NodeName:custom-flannel-675779 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1217 21:30:52.078166  638545 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "custom-flannel-675779"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
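
Note that the generated kubeadm.yaml above is a single multi-document YAML stream: InitConfiguration, ClusterConfiguration, KubeletConfiguration and KubeProxyConfiguration separated by ---, which kubeadm splits apart by each document's kind field. A small Go sketch of walking such a stream; the gopkg.in/yaml.v3 dependency and the local file name kubeadm.yaml are our choices for illustration, not anything minikube mandates:

    package main

    import (
        "fmt"
        "io"
        "os"

        "gopkg.in/yaml.v3"
    )

    func main() {
        f, err := os.Open("kubeadm.yaml")
        if err != nil {
            panic(err)
        }
        defer f.Close()

        // yaml.Decoder yields one document per Decode call and io.EOF at the end.
        dec := yaml.NewDecoder(f)
        for {
            var doc struct {
                APIVersion string `yaml:"apiVersion"`
                Kind       string `yaml:"kind"`
            }
            if err := dec.Decode(&doc); err == io.EOF {
                break
            } else if err != nil {
                panic(err)
            }
            fmt.Printf("%s (%s)\n", doc.Kind, doc.APIVersion)
        }
    }
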
	
	I1217 21:30:52.078239  638545 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1217 21:30:52.086331  638545 binaries.go:51] Found k8s binaries, skipping transfer
	I1217 21:30:52.086437  638545 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1217 21:30:52.094534  638545 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (325 bytes)
	I1217 21:30:52.108176  638545 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1217 21:30:52.122675  638545 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2234 bytes)
	I1217 21:30:52.136501  638545 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1217 21:30:52.140391  638545 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.85.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1217 21:30:52.150861  638545 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 21:30:52.274609  638545 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 21:30:52.301983  638545 certs.go:69] Setting up /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779 for IP: 192.168.85.2
	I1217 21:30:52.302055  638545 certs.go:195] generating shared ca certs ...
	I1217 21:30:52.302087  638545 certs.go:227] acquiring lock for ca certs: {Name:mk528c7ee25f2f3d78de33f266a77f908cb2a9d0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:52.302256  638545 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key
	I1217 21:30:52.302337  638545 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key
	I1217 21:30:52.302367  638545 certs.go:257] generating profile certs ...
	I1217 21:30:52.302450  638545 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.key
	I1217 21:30:52.302487  638545 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.crt with IP's: []
	I1217 21:30:52.657885  638545 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.crt ...
	I1217 21:30:52.657918  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.crt: {Name:mkd8125dfe0e14dd6914a4f6725a09f09280d791 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:52.658707  638545 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.key ...
	I1217 21:30:52.658729  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.key: {Name:mk03085feebe2a02b15cf95503481e4f30d60e6c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:52.659518  638545 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key.80b34340
	I1217 21:30:52.659546  638545 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt.80b34340 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.85.2]
	I1217 21:30:52.800164  638545 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt.80b34340 ...
	I1217 21:30:52.800198  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt.80b34340: {Name:mke3c847d2637d29bf523c0dcf6b2ce0d0e76988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:52.801009  638545 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key.80b34340 ...
	I1217 21:30:52.801031  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key.80b34340: {Name:mk7cea36725a9b9a13b31b40a19df3704436fa23 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:52.801729  638545 certs.go:382] copying /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt.80b34340 -> /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt
	I1217 21:30:52.801829  638545 certs.go:386] copying /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key.80b34340 -> /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key
	I1217 21:30:52.801982  638545 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.key
	I1217 21:30:52.802008  638545 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.crt with IP's: []
	I1217 21:30:53.083509  638545 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.crt ...
	I1217 21:30:53.083542  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.crt: {Name:mk7b93f924024fdc47dd5a2dc83b0207324b2a3d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:53.084441  638545 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.key ...
	I1217 21:30:53.084463  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.key: {Name:mka70133f6937b2ac65c502da6b77ebc61b73813 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:30:53.085294  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem (1338 bytes)
	W1217 21:30:53.085346  638545 certs.go:480] ignoring /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461_empty.pem, impossibly tiny 0 bytes
	I1217 21:30:53.085361  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca-key.pem (1675 bytes)
	I1217 21:30:53.085389  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/ca.pem (1082 bytes)
	I1217 21:30:53.085418  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/cert.pem (1123 bytes)
	I1217 21:30:53.085447  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/certs/key.pem (1679 bytes)
	I1217 21:30:53.085501  638545 certs.go:484] found cert: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem (1708 bytes)
	I1217 21:30:53.086077  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1217 21:30:53.107772  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1217 21:30:53.129015  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1217 21:30:53.155321  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1217 21:30:53.174413  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I1217 21:30:53.192589  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1217 21:30:53.211382  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1217 21:30:53.229636  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1217 21:30:53.247395  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/certs/369461.pem --> /usr/share/ca-certificates/369461.pem (1338 bytes)
	I1217 21:30:53.265591  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/ssl/certs/3694612.pem --> /usr/share/ca-certificates/3694612.pem (1708 bytes)
	I1217 21:30:53.284026  638545 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1217 21:30:53.302663  638545 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1217 21:30:53.316182  638545 ssh_runner.go:195] Run: openssl version
	I1217 21:30:53.323067  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:53.331127  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1217 21:30:53.339427  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:53.343361  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 17 20:08 /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:53.343430  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1217 21:30:53.384882  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1217 21:30:53.392761  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/369461.pem
	I1217 21:30:53.400433  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/369461.pem /etc/ssl/certs/369461.pem
	I1217 21:30:53.408509  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/369461.pem
	I1217 21:30:53.412518  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 17 20:17 /usr/share/ca-certificates/369461.pem
	I1217 21:30:53.412583  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/369461.pem
	I1217 21:30:53.459079  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1217 21:30:53.467049  638545 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/3694612.pem
	I1217 21:30:53.474870  638545 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/3694612.pem /etc/ssl/certs/3694612.pem
	I1217 21:30:53.482946  638545 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3694612.pem
	I1217 21:30:53.486930  638545 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 17 20:17 /usr/share/ca-certificates/3694612.pem
	I1217 21:30:53.487044  638545 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3694612.pem
	I1217 21:30:53.528929  638545 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
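The three test/ln/openssl sequences above follow OpenSSL's hashed CA directory convention: each certificate must also be reachable in /etc/ssl/certs through a symlink named <subject-hash>.0, which is why the log verifies b5213941.0, 51391683.0 and 3ec20f2e.0 after hashing each PEM. A sketch of the same step done by hand:

	# sketch: build the subject-hash symlink that `sudo test -L` checks above
	CERT=/usr/share/ca-certificates/minikubeCA.pem
	HASH=$(openssl x509 -hash -noout -in "$CERT")   # prints e.g. b5213941
	sudo ln -fs "$CERT" "/etc/ssl/certs/${HASH}.0"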
	I1217 21:30:53.537216  638545 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1217 21:30:53.541286  638545 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1217 21:30:53.541348  638545 kubeadm.go:401] StartCluster: {Name:custom-flannel-675779 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:custom-flannel-675779 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:testdata/kube-flannel.yaml} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 21:30:53.541430  638545 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1217 21:30:53.541487  638545 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1217 21:30:53.568833  638545 cri.go:89] found id: ""
	I1217 21:30:53.568920  638545 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1217 21:30:53.577696  638545 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1217 21:30:53.586985  638545 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1217 21:30:53.587078  638545 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1217 21:30:53.595661  638545 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1217 21:30:53.595690  638545 kubeadm.go:158] found existing configuration files:
	
	I1217 21:30:53.595746  638545 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1217 21:30:53.604450  638545 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1217 21:30:53.604530  638545 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1217 21:30:53.612795  638545 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1217 21:30:53.620821  638545 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1217 21:30:53.620938  638545 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1217 21:30:53.628551  638545 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1217 21:30:53.636911  638545 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1217 21:30:53.637000  638545 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1217 21:30:53.644953  638545 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1217 21:30:53.653520  638545 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1217 21:30:53.653617  638545 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1217 21:30:53.661547  638545 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1217 21:30:53.743828  638545 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1217 21:30:53.744402  638545 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1217 21:30:53.835754  638545 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1217 21:31:12.554478  638545 kubeadm.go:319] [init] Using Kubernetes version: v1.34.3
	I1217 21:31:12.554535  638545 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 21:31:12.554619  638545 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 21:31:12.554672  638545 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 21:31:12.554706  638545 kubeadm.go:319] OS: Linux
	I1217 21:31:12.554749  638545 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 21:31:12.554795  638545 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 21:31:12.554839  638545 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 21:31:12.554885  638545 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 21:31:12.554931  638545 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 21:31:12.554976  638545 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 21:31:12.555019  638545 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 21:31:12.555065  638545 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 21:31:12.555108  638545 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 21:31:12.555177  638545 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 21:31:12.555267  638545 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 21:31:12.555354  638545 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 21:31:12.555413  638545 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 21:31:12.558417  638545 out.go:252]   - Generating certificates and keys ...
	I1217 21:31:12.558510  638545 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 21:31:12.558574  638545 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 21:31:12.558642  638545 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1217 21:31:12.558704  638545 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1217 21:31:12.558764  638545 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1217 21:31:12.558816  638545 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1217 21:31:12.558869  638545 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1217 21:31:12.558994  638545 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [custom-flannel-675779 localhost] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1217 21:31:12.559047  638545 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1217 21:31:12.559169  638545 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [custom-flannel-675779 localhost] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1217 21:31:12.559236  638545 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1217 21:31:12.559298  638545 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1217 21:31:12.559342  638545 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1217 21:31:12.559421  638545 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 21:31:12.559473  638545 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 21:31:12.559529  638545 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 21:31:12.559585  638545 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 21:31:12.559653  638545 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 21:31:12.559707  638545 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 21:31:12.559788  638545 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 21:31:12.559854  638545 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 21:31:12.562915  638545 out.go:252]   - Booting up control plane ...
	I1217 21:31:12.563041  638545 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 21:31:12.563143  638545 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 21:31:12.563220  638545 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 21:31:12.563322  638545 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 21:31:12.563424  638545 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 21:31:12.563533  638545 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 21:31:12.563622  638545 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 21:31:12.563700  638545 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 21:31:12.563900  638545 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 21:31:12.564053  638545 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 21:31:12.564122  638545 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 1.501064036s
	I1217 21:31:12.564226  638545 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1217 21:31:12.564450  638545 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.85.2:8443/livez
	I1217 21:31:12.564780  638545 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1217 21:31:12.564875  638545 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1217 21:31:12.564976  638545 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 3.950371563s
	I1217 21:31:12.565109  638545 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 5.049963489s
	I1217 21:31:12.565188  638545 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 7.002287017s
	I1217 21:31:12.565301  638545 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1217 21:31:12.565430  638545 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1217 21:31:12.565500  638545 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1217 21:31:12.565696  638545 kubeadm.go:319] [mark-control-plane] Marking the node custom-flannel-675779 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1217 21:31:12.565758  638545 kubeadm.go:319] [bootstrap-token] Using token: c1kkgx.xe0kq6br3m9hjeki
	I1217 21:31:12.570414  638545 out.go:252]   - Configuring RBAC rules ...
	I1217 21:31:12.570548  638545 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1217 21:31:12.570638  638545 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1217 21:31:12.570785  638545 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1217 21:31:12.570917  638545 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1217 21:31:12.571035  638545 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1217 21:31:12.571126  638545 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1217 21:31:12.571244  638545 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1217 21:31:12.571293  638545 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1217 21:31:12.571344  638545 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1217 21:31:12.571352  638545 kubeadm.go:319] 
	I1217 21:31:12.571412  638545 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1217 21:31:12.571419  638545 kubeadm.go:319] 
	I1217 21:31:12.571496  638545 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1217 21:31:12.571503  638545 kubeadm.go:319] 
	I1217 21:31:12.571528  638545 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1217 21:31:12.571599  638545 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1217 21:31:12.571660  638545 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1217 21:31:12.571667  638545 kubeadm.go:319] 
	I1217 21:31:12.571722  638545 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1217 21:31:12.571729  638545 kubeadm.go:319] 
	I1217 21:31:12.571776  638545 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1217 21:31:12.571786  638545 kubeadm.go:319] 
	I1217 21:31:12.571845  638545 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1217 21:31:12.571924  638545 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1217 21:31:12.571998  638545 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1217 21:31:12.572006  638545 kubeadm.go:319] 
	I1217 21:31:12.572089  638545 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1217 21:31:12.572170  638545 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1217 21:31:12.572181  638545 kubeadm.go:319] 
	I1217 21:31:12.572310  638545 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token c1kkgx.xe0kq6br3m9hjeki \
	I1217 21:31:12.572419  638545 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:09ac3e211c45a99f82593998997af866fbad385d247bea2aac14334cee5675a4 \
	I1217 21:31:12.572443  638545 kubeadm.go:319] 	--control-plane 
	I1217 21:31:12.572450  638545 kubeadm.go:319] 
	I1217 21:31:12.572536  638545 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1217 21:31:12.572544  638545 kubeadm.go:319] 
	I1217 21:31:12.572626  638545 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token c1kkgx.xe0kq6br3m9hjeki \
	I1217 21:31:12.572747  638545 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:09ac3e211c45a99f82593998997af866fbad385d247bea2aac14334cee5675a4 
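The join commands printed by kubeadm embed the bootstrap token above, which the InitConfiguration earlier limits to a 24h TTL. A node added after expiry needs a fresh token; as a sketch, run on the control-plane node:

	# sketch: mint a new token and print a ready-to-use join command
	sudo /var/lib/minikube/binaries/v1.34.3/kubeadm token create --print-join-command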
	I1217 21:31:12.572759  638545 cni.go:84] Creating CNI manager for "testdata/kube-flannel.yaml"
	I1217 21:31:12.575988  638545 out.go:179] * Configuring testdata/kube-flannel.yaml (Container Networking Interface) ...
	I1217 21:31:12.578826  638545 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.3/kubectl ...
	I1217 21:31:12.578912  638545 ssh_runner.go:195] Run: stat -c "%s %y" /var/tmp/minikube/cni.yaml
	I1217 21:31:12.582911  638545 ssh_runner.go:352] existence check for /var/tmp/minikube/cni.yaml: stat -c "%s %y" /var/tmp/minikube/cni.yaml: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/tmp/minikube/cni.yaml': No such file or directory
	I1217 21:31:12.582939  638545 ssh_runner.go:362] scp testdata/kube-flannel.yaml --> /var/tmp/minikube/cni.yaml (4578 bytes)
	I1217 21:31:12.604764  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
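With the manifest applied, pod networking depends on the flannel DaemonSet rolling out. A hedged check follows; the kube-flannel namespace is what the upstream kube-flannel.yaml creates, and the testdata copy used here may differ:

	# sketch: watch the flannel pods created by the applied CNI manifest
	sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig get pods -n kube-flannel -o wide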
	I1217 21:31:13.135241  638545 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1217 21:31:13.135406  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes custom-flannel-675779 minikube.k8s.io/updated_at=2025_12_17T21_31_13_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=e7c291d147a7a4c759554efbd6d659a1a65fa869 minikube.k8s.io/name=custom-flannel-675779 minikube.k8s.io/primary=true
	I1217 21:31:13.135409  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:13.329759  638545 ops.go:34] apiserver oom_adj: -16
	I1217 21:31:13.329867  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:13.830047  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:14.330413  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:14.830377  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:15.330677  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:15.830446  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:16.329986  638545 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1217 21:31:16.427869  638545 kubeadm.go:1114] duration metric: took 3.292596132s to wait for elevateKubeSystemPrivileges
	I1217 21:31:16.427905  638545 kubeadm.go:403] duration metric: took 22.886560413s to StartCluster
	I1217 21:31:16.427923  638545 settings.go:142] acquiring lock: {Name:mkec67bf414aabef990098a6cc4910956f0d3622 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:31:16.427991  638545 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 21:31:16.428933  638545 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21808-367595/kubeconfig: {Name:mk68b516071fc5d9da0842bf56ff4d318cea3c03 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1217 21:31:16.429181  638545 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1217 21:31:16.429187  638545 start.go:236] Will wait 15m0s for node &{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1217 21:31:16.429469  638545 config.go:182] Loaded profile config "custom-flannel-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:31:16.429596  638545 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1217 21:31:16.429660  638545 addons.go:70] Setting storage-provisioner=true in profile "custom-flannel-675779"
	I1217 21:31:16.429672  638545 addons.go:239] Setting addon storage-provisioner=true in "custom-flannel-675779"
	I1217 21:31:16.429695  638545 host.go:66] Checking if "custom-flannel-675779" exists ...
	I1217 21:31:16.430188  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:31:16.430710  638545 addons.go:70] Setting default-storageclass=true in profile "custom-flannel-675779"
	I1217 21:31:16.430733  638545 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "custom-flannel-675779"
	I1217 21:31:16.431028  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:31:16.434877  638545 out.go:179] * Verifying Kubernetes components...
	I1217 21:31:16.442445  638545 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1217 21:31:16.463045  638545 addons.go:239] Setting addon default-storageclass=true in "custom-flannel-675779"
	I1217 21:31:16.463085  638545 host.go:66] Checking if "custom-flannel-675779" exists ...
	I1217 21:31:16.463502  638545 cli_runner.go:164] Run: docker container inspect custom-flannel-675779 --format={{.State.Status}}
	I1217 21:31:16.473139  638545 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1217 21:31:16.476013  638545 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 21:31:16.476036  638545 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1217 21:31:16.476098  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:31:16.500982  638545 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1217 21:31:16.501004  638545 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1217 21:31:16.501066  638545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" custom-flannel-675779
	I1217 21:31:16.509941  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:31:16.535131  638545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33448 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/custom-flannel-675779/id_rsa Username:docker}
	I1217 21:31:16.680861  638545 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.85.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
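The sed pipeline above edits the live CoreDNS Corefile in two places: it injects a hosts block ahead of the forward plugin so host.minikube.internal resolves to the gateway, and adds the log plugin before errors. Reconstructed from those two expressions (not captured from the cluster), the patched server block looks roughly like:

	.:53 {
	    log
	    errors
	    hosts {
	       192.168.85.1 host.minikube.internal
	       fallthrough
	    }
	    forward . /etc/resolv.conf
	}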
	I1217 21:31:16.709850  638545 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1217 21:31:16.741899  638545 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1217 21:31:16.835174  638545 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1217 21:31:17.281818  638545 start.go:1013] {"host.minikube.internal": 192.168.85.1} host record injected into CoreDNS's ConfigMap
	I1217 21:31:17.283689  638545 node_ready.go:35] waiting up to 15m0s for node "custom-flannel-675779" to be "Ready" ...
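minikube's node_ready wait polls the node object until it reports Ready, which here also depends on flannel initializing the pod network. An equivalent manual wait, as a sketch:

	# sketch: block until the node reports Ready, mirroring the 15m wait above
	sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig wait --for=condition=Ready node/custom-flannel-675779 --timeout=15m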
	I1217 21:31:17.791113  638545 kapi.go:214] "coredns" deployment in "kube-system" namespace and "custom-flannel-675779" context rescaled to 1 replicas
	I1217 21:31:17.847446  638545 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.012235783s)
	I1217 21:31:17.851597  638545 out.go:179] * Enabled addons: default-storageclass, storage-provisioner
	I1217 21:31:17.854427  638545 addons.go:530] duration metric: took 1.424827057s for enable addons: enabled=[default-storageclass storage-provisioner]
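Both addons were shipped as manifests under /etc/kubernetes/addons and applied with the cluster's own kubectl, as the Run: lines above show. Their state can be confirmed from the host afterwards, assuming a minikube binary on PATH:

	# check which addons the profile ended up with
	minikube addons list -p custom-flannel-675779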
	I1217 21:31:23.155243  588228 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1217 21:31:23.155281  588228 kubeadm.go:319] 
	I1217 21:31:23.155353  588228 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1217 21:31:23.156267  588228 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1217 21:31:23.156329  588228 kubeadm.go:319] [preflight] Running pre-flight checks
	I1217 21:31:23.156453  588228 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1217 21:31:23.156522  588228 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1217 21:31:23.156557  588228 kubeadm.go:319] OS: Linux
	I1217 21:31:23.156603  588228 kubeadm.go:319] CGROUPS_CPU: enabled
	I1217 21:31:23.156652  588228 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1217 21:31:23.156699  588228 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1217 21:31:23.156747  588228 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1217 21:31:23.156795  588228 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1217 21:31:23.156844  588228 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1217 21:31:23.156889  588228 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1217 21:31:23.156937  588228 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1217 21:31:23.156983  588228 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1217 21:31:23.157055  588228 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1217 21:31:23.157150  588228 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1217 21:31:23.157239  588228 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1217 21:31:23.157301  588228 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1217 21:31:23.160332  588228 out.go:252]   - Generating certificates and keys ...
	I1217 21:31:23.160435  588228 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1217 21:31:23.160513  588228 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1217 21:31:23.160596  588228 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1217 21:31:23.160663  588228 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1217 21:31:23.160763  588228 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1217 21:31:23.160838  588228 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1217 21:31:23.160911  588228 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1217 21:31:23.160983  588228 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1217 21:31:23.161063  588228 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1217 21:31:23.161157  588228 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1217 21:31:23.161206  588228 kubeadm.go:319] [certs] Using the existing "sa" key
	I1217 21:31:23.161274  588228 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1217 21:31:23.161329  588228 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1217 21:31:23.161392  588228 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1217 21:31:23.161449  588228 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1217 21:31:23.161516  588228 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1217 21:31:23.161574  588228 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1217 21:31:23.161662  588228 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1217 21:31:23.161730  588228 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1217 21:31:23.164765  588228 out.go:252]   - Booting up control plane ...
	I1217 21:31:23.164890  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1217 21:31:23.164975  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1217 21:31:23.165047  588228 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1217 21:31:23.165154  588228 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1217 21:31:23.165254  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1217 21:31:23.165361  588228 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1217 21:31:23.165447  588228 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1217 21:31:23.165499  588228 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1217 21:31:23.165630  588228 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1217 21:31:23.165739  588228 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1217 21:31:23.165807  588228 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000869511s
	I1217 21:31:23.165815  588228 kubeadm.go:319] 
	I1217 21:31:23.165872  588228 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1217 21:31:23.165908  588228 kubeadm.go:319] 	- The kubelet is not running
	I1217 21:31:23.166015  588228 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1217 21:31:23.166022  588228 kubeadm.go:319] 
	I1217 21:31:23.166126  588228 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1217 21:31:23.166161  588228 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1217 21:31:23.166194  588228 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1217 21:31:23.166264  588228 kubeadm.go:403] duration metric: took 12m19.05543739s to StartCluster
	I1217 21:31:23.166300  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1217 21:31:23.166368  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1217 21:31:23.166466  588228 kubeadm.go:319] 
	I1217 21:31:23.193416  588228 cri.go:89] found id: ""
	I1217 21:31:23.193440  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.193453  588228 logs.go:284] No container was found matching "kube-apiserver"
	I1217 21:31:23.193460  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1217 21:31:23.193520  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1217 21:31:23.223314  588228 cri.go:89] found id: ""
	I1217 21:31:23.223349  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.223358  588228 logs.go:284] No container was found matching "etcd"
	I1217 21:31:23.223365  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1217 21:31:23.223430  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1217 21:31:23.252042  588228 cri.go:89] found id: ""
	I1217 21:31:23.252070  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.252079  588228 logs.go:284] No container was found matching "coredns"
	I1217 21:31:23.252085  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1217 21:31:23.252147  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1217 21:31:23.281849  588228 cri.go:89] found id: ""
	I1217 21:31:23.281877  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.281891  588228 logs.go:284] No container was found matching "kube-scheduler"
	I1217 21:31:23.281898  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1217 21:31:23.281958  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1217 21:31:23.312201  588228 cri.go:89] found id: ""
	I1217 21:31:23.312312  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.312336  588228 logs.go:284] No container was found matching "kube-proxy"
	I1217 21:31:23.312354  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1217 21:31:23.312444  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1217 21:31:23.347018  588228 cri.go:89] found id: ""
	I1217 21:31:23.347042  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.347050  588228 logs.go:284] No container was found matching "kube-controller-manager"
	I1217 21:31:23.347057  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1217 21:31:23.347113  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1217 21:31:23.382404  588228 cri.go:89] found id: ""
	I1217 21:31:23.382427  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.382435  588228 logs.go:284] No container was found matching "kindnet"
	I1217 21:31:23.382441  588228 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1217 21:31:23.382498  588228 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1217 21:31:23.415326  588228 cri.go:89] found id: ""
	I1217 21:31:23.415349  588228 logs.go:282] 0 containers: []
	W1217 21:31:23.415357  588228 logs.go:284] No container was found matching "storage-provisioner"
	I1217 21:31:23.415367  588228 logs.go:123] Gathering logs for kubelet ...
	I1217 21:31:23.415378  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1217 21:31:23.478808  588228 logs.go:123] Gathering logs for dmesg ...
	I1217 21:31:23.478848  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1217 21:31:23.494510  588228 logs.go:123] Gathering logs for describe nodes ...
	I1217 21:31:23.494539  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1217 21:31:23.569772  588228 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1217 21:31:23.569797  588228 logs.go:123] Gathering logs for containerd ...
	I1217 21:31:23.569810  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1217 21:31:23.614990  588228 logs.go:123] Gathering logs for container status ...
	I1217 21:31:23.615033  588228 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1217 21:31:23.651955  588228 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000869511s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
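The failure shape is consistent throughout: kubeadm's wait-control-plane phase polls the kubelet healthz endpoint for 4m0s and gives up because the kubelet never starts serving. Given the cgroup v1 deprecation warning in the stderr above (which points at the kubelet's FailCgroupV1 option for v1.35+), a first-pass triage on the node would look like this sketch:

	# sketch: triage a kubelet that never answers healthz
	systemctl status kubelet --no-pager
	journalctl -xeu kubelet | tail -n 50
	stat -fc %T /sys/fs/cgroup        # cgroup2fs = v2; tmpfs = the deprecated v1 mode
	curl -sS http://127.0.0.1:10248/healthz; echo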
	W1217 21:31:23.652001  588228 out.go:285] * 
	W1217 21:31:23.652076  588228 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	(kubeadm init stdout/stderr omitted here; identical to the kubeadm output logged immediately above)
	
	W1217 21:31:23.652096  588228 out.go:285] * 
	W1217 21:31:23.654399  588228 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1217 21:31:23.659480  588228 out.go:203] 
	W1217 21:31:23.663155  588228 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	(kubeadm init stdout/stderr omitted; identical to the output above)
	
	W1217 21:31:23.663260  588228 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1217 21:31:23.663322  588228 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1217 21:31:23.666523  588228 out.go:203] 
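For reference, the retry that the suggestion above describes would look roughly like the following. This is a sketch only: the flags other than --extra-config are assumed from this job's driver and runtime, and since the kubelet on this host exits on cgroup v1 validation rather than on a cgroup-driver mismatch, the suggested flag alone may not resolve the failure.
	out/minikube-linux-arm64 start -p kubernetes-upgrade-332113 \
	  --driver=docker --container-runtime=containerd \
	  --kubernetes-version=v1.35.0-rc.1 \
	  --extra-config=kubelet.cgroup-driver=systemd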
	W1217 21:31:19.287602  638545 node_ready.go:57] node "custom-flannel-675779" has "Ready":"False" status (will retry)
	I1217 21:31:21.286699  638545 node_ready.go:49] node "custom-flannel-675779" is "Ready"
	I1217 21:31:21.286727  638545 node_ready.go:38] duration metric: took 4.003007358s for node "custom-flannel-675779" to be "Ready" ...
	I1217 21:31:21.286742  638545 api_server.go:52] waiting for apiserver process to appear ...
	I1217 21:31:21.286812  638545 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:31:21.299000  638545 api_server.go:72] duration metric: took 4.86978168s to wait for apiserver process to appear ...
	I1217 21:31:21.299029  638545 api_server.go:88] waiting for apiserver healthz status ...
	I1217 21:31:21.299050  638545 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1217 21:31:21.307078  638545 api_server.go:279] https://192.168.85.2:8443/healthz returned 200:
	ok
	I1217 21:31:21.308413  638545 api_server.go:141] control plane version: v1.34.3
	I1217 21:31:21.308447  638545 api_server.go:131] duration metric: took 9.40816ms to wait for apiserver health ...
	I1217 21:31:21.308456  638545 system_pods.go:43] waiting for kube-system pods to appear ...
	I1217 21:31:21.311760  638545 system_pods.go:59] 7 kube-system pods found
	I1217 21:31:21.311799  638545 system_pods.go:61] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:21.311807  638545 system_pods.go:61] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:21.311813  638545 system_pods.go:61] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:21.311818  638545 system_pods.go:61] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:21.311822  638545 system_pods.go:61] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:21.311826  638545 system_pods.go:61] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:21.311831  638545 system_pods.go:61] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1217 21:31:21.311836  638545 system_pods.go:74] duration metric: took 3.374749ms to wait for pod list to return data ...
	I1217 21:31:21.311851  638545 default_sa.go:34] waiting for default service account to be created ...
	I1217 21:31:21.315081  638545 default_sa.go:45] found service account: "default"
	I1217 21:31:21.315114  638545 default_sa.go:55] duration metric: took 3.25775ms for default service account to be created ...
	I1217 21:31:21.315126  638545 system_pods.go:116] waiting for k8s-apps to be running ...
	I1217 21:31:21.318124  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:21.318155  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:21.318166  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:21.318177  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:21.318186  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:21.318190  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:21.318198  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:21.318209  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1217 21:31:21.318239  638545 retry.go:31] will retry after 268.350268ms: missing components: kube-dns
	I1217 21:31:21.594847  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:21.594886  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:21.594896  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:21.594902  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:21.594907  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:21.594911  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:21.594915  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:21.594921  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1217 21:31:21.594935  638545 retry.go:31] will retry after 346.827421ms: missing components: kube-dns
	I1217 21:31:21.946983  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:21.947020  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:21.947029  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:21.947038  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:21.947043  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:21.947047  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:21.947052  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:21.947058  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1217 21:31:21.947072  638545 retry.go:31] will retry after 347.57623ms: missing components: kube-dns
	I1217 21:31:22.299059  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:22.299094  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:22.299103  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:22.299110  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:22.299116  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:22.299120  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:22.299125  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:22.299129  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Running
	I1217 21:31:22.299145  638545 retry.go:31] will retry after 401.532257ms: missing components: kube-dns
	I1217 21:31:22.704477  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:22.704513  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:22.704523  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1217 21:31:22.704530  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:22.704535  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:22.704540  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:22.704546  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:22.704550  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Running
	I1217 21:31:22.704564  638545 retry.go:31] will retry after 624.393718ms: missing components: kube-dns
	I1217 21:31:23.335083  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:23.335116  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:23.335124  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running
	I1217 21:31:23.335131  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:23.335136  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:23.335140  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:23.335144  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:23.335147  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Running
	I1217 21:31:23.335163  638545 retry.go:31] will retry after 756.140423ms: missing components: kube-dns
	I1217 21:31:24.095907  638545 system_pods.go:86] 7 kube-system pods found
	I1217 21:31:24.095939  638545 system_pods.go:89] "coredns-66bc5c9577-lnkwl" [be0caf42-9df2-4aea-9437-07865676353f] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1217 21:31:24.095946  638545 system_pods.go:89] "etcd-custom-flannel-675779" [ac7477a0-3879-4cec-b0a1-b43642184ddb] Running
	I1217 21:31:24.095953  638545 system_pods.go:89] "kube-apiserver-custom-flannel-675779" [2b8ae0f8-eee7-4053-8812-da4818e0ebdf] Running
	I1217 21:31:24.095957  638545 system_pods.go:89] "kube-controller-manager-custom-flannel-675779" [02b11901-9dfe-41b3-9eff-e0248b069bc3] Running
	I1217 21:31:24.095961  638545 system_pods.go:89] "kube-proxy-ns62f" [dd93017d-5047-4e64-99d8-f501b7d4ea5f] Running
	I1217 21:31:24.095965  638545 system_pods.go:89] "kube-scheduler-custom-flannel-675779" [897eccb8-9c3e-412e-aab4-4a574e3f669a] Running
	I1217 21:31:24.095970  638545 system_pods.go:89] "storage-provisioner" [cd8e3f7d-5ab9-45db-bc13-26ac5ccb1d92] Running
	I1217 21:31:24.095985  638545 retry.go:31] will retry after 783.754378ms: missing components: kube-dns
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.317120332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.318839085Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.808927556s"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.318984793Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.322260106Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.958035426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.959962698Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.962570271Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.966336543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.967310887Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 644.885085ms"
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.967468820Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
	Dec 17 21:23:16 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:16.968390848Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.574316470Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.576194338Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21753021"
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.578639448Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.582726182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.584143270Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 1.615700287s"
	Dec 17 21:23:18 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:23:18.584186487Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\""
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.168557513Z" level=info msg="container event discarded" container=f78e467d6d67870a45ce591688f2e9b68a717af374150cd036960217005ab501 type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.184314889Z" level=info msg="container event discarded" container=64b4afddf51e66c1b926de0568ec51193df3065ecafd826e507852f826563542 type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.194689915Z" level=info msg="container event discarded" container=f95ef6f56b9794f38f4949b78fe16b00037ce5fa73e44c2c9a896f0ded77b85d type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.194751642Z" level=info msg="container event discarded" container=d437f9493807d6f6bd9b5e58709246591e9419cfd8008f86e3bb406e9c3ea15d type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.215158855Z" level=info msg="container event discarded" container=eb034432e82e107044546955cf75d2fcfa97c511604210a884bf1d57bf1fa2bd type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.215214479Z" level=info msg="container event discarded" container=a669180280de641ee626925e66576a7abc0bdcd3eef6c0b57529b4cea3bb9731 type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.232590784Z" level=info msg="container event discarded" container=0a817fd13d55a4ce1907c5fd0fbc27707909c765a01bb501a864e8ec907168df type=CONTAINER_DELETED_EVENT
	Dec 17 21:28:08 kubernetes-upgrade-332113 containerd[593]: time="2025-12-17T21:28:08.232652537Z" level=info msg="container event discarded" container=27c4220075696c53e51ec6d0ea8afa0345f8a64fec4d2152cd6b2fbeafc97664 type=CONTAINER_DELETED_EVENT
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec17 19:37] hrtimer: interrupt took 15014583 ns
	[Dec17 19:39] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:06] kauditd_printk_skb: 8 callbacks suppressed
	[Dec17 20:17] FS-Cache: Duplicate cookie detected
	[  +0.000767] FS-Cache: O-cookie c=00000031 [p=00000002 fl=222 nc=0 na=1]
	[  +0.001036] FS-Cache: O-cookie d=00000000b1f70094{9P.session} n=000000004124fba5
	[  +0.001177] FS-Cache: O-key=[10] '34323937353834383437'
	[  +0.000816] FS-Cache: N-cookie c=00000032 [p=00000002 fl=2 nc=0 na=1]
	[  +0.001043] FS-Cache: N-cookie d=00000000b1f70094{9P.session} n=000000009cece4cf
	[  +0.001160] FS-Cache: N-key=[10] '34323937353834383437'
	
	
	==> kernel <==
	 21:31:25 up  4:13,  0 user,  load average: 3.51, 2.40, 2.42
	Linux kubernetes-upgrade-332113 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 17 21:31:22 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 21:31:22 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 17 21:31:22 kubernetes-upgrade-332113 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:22 kubernetes-upgrade-332113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:23 kubernetes-upgrade-332113 kubelet[14356]: E1217 21:31:23.034924   14356 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:23 kubernetes-upgrade-332113 kubelet[14450]: E1217 21:31:23.805799   14450 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 21:31:23 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 21:31:24 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 17 21:31:24 kubernetes-upgrade-332113 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:24 kubernetes-upgrade-332113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:24 kubernetes-upgrade-332113 kubelet[14456]: E1217 21:31:24.554779   14456 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 21:31:24 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 21:31:24 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 17 21:31:25 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 17 21:31:25 kubernetes-upgrade-332113 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:25 kubernetes-upgrade-332113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 17 21:31:25 kubernetes-upgrade-332113 kubelet[14543]: E1217 21:31:25.303694   14543 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 17 21:31:25 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 17 21:31:25 kubernetes-upgrade-332113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-332113 -n kubernetes-upgrade-332113
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-332113 -n kubernetes-upgrade-332113: exit status 2 (334.705803ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "kubernetes-upgrade-332113" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:176: Cleaning up "kubernetes-upgrade-332113" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-332113
E1217 21:31:28.507536  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-332113: (2.772589743s)
--- FAIL: TestKubernetesUpgrade (794.69s)
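The kubelet journal above shows the same validation failure on every systemd restart ("kubelet is configured to not run on a host using cgroup v1"), consistent with a host still on the legacy cgroup layout. A quick way to confirm the host's cgroup mode, using the standard stat invocation:
	# Prints the filesystem type mounted at /sys/fs/cgroup:
	# "cgroup2fs" means the unified cgroup v2 hierarchy; "tmpfs" indicates
	# the legacy cgroup v1 layout that kubelet v1.35.0-rc.1 rejects.
	stat -fc %T /sys/fs/cgroup/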

                                                
                                    
x
+
TestStartStop/group/no-preload/serial/AddonExistsAfterStop (7200.07s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:285: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: (the identical "connection refused" pod-list warning repeated on every subsequent retry)
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
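The run above is the helper at helpers_test.go:338 polling the apiserver for dashboard pods by label selector and logging one WARNING per failed attempt; with the apiserver stopped, every attempt ends in connection refused until the test deadline expires. A minimal sketch of that kind of poll loop using client-go follows; this is not minikube's actual helper, and the kubeconfig path, 5s interval, and 2h timeout are illustrative assumptions:

```go
// Minimal sketch (assumed parameters, not minikube's helper): poll the
// apiserver for dashboard pods by label selector and log a warning on
// each failed attempt, matching the shape of the WARNING lines above.
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; substitute the profile's real file.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	err = wait.PollUntilContextTimeout(context.Background(), 5*time.Second, 2*time.Hour, true,
		func(ctx context.Context) (bool, error) {
			pods, err := client.CoreV1().Pods("kubernetes-dashboard").List(ctx, metav1.ListOptions{
				LabelSelector: "k8s-app=kubernetes-dashboard",
			})
			if err != nil {
				// Transient errors (e.g. connection refused while the
				// apiserver is down) are logged and swallowed so the
				// poll keeps retrying until the timeout.
				fmt.Printf("WARNING: pod list returned: %v\n", err)
				return false, nil
			}
			return len(pods.Items) > 0, nil
		})
	if err != nil {
		fmt.Printf("gave up waiting for dashboard pods: %v\n", err)
	}
}
```

Returning false with a nil error from the condition treats each failure as retryable, which is why the report accumulates long runs of identical WARNING lines for as long as the endpoint stays down.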
E1217 22:04:36.456573  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 4 more times]
E1217 22:04:40.591000  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/bridge-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 7 more times]
E1217 22:04:48.967892  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 22:04:49.015507  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 9 more times]
E1217 22:04:58.826550  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/old-k8s-version-247817/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 2 more times]
E1217 22:05:02.034586  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/calico-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 64 more times]
E1217 22:06:06.942009  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kindnet-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 8 more times]
E1217 22:06:15.851676  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/default-k8s-diff-port-479888/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 11 more times]
E1217 22:06:28.507635  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 8 more times]
E1217 22:06:37.451855  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/custom-flannel-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[previous warning repeated 8 more times]
E1217 22:06:45.902765  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
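The warnings above are emitted once per retry while the test helper polls the apiserver for dashboard pods and the endpoint at 192.168.76.2:8443 stays down. A minimal Go sketch of that kind of retry loop, hypothetical and not minikube's actual helper (waitForPods and the timeout value are illustrative; the URL is taken from the log):

// poll_sketch.go - hypothetical retry loop that logs one warning per
// failed attempt, mirroring the helper output above.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func waitForPods(url string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		resp, err := http.Get(url) // "connection refused" while the apiserver is down
		if err != nil {
			fmt.Printf("WARNING: pod list returned: %v\n", err)
			time.Sleep(3 * time.Second)
			continue
		}
		resp.Body.Close()
		return nil
	}
	return fmt.Errorf("timed out after %s waiting for %s", timeout, url)
}

func main() {
	// Any unreachable endpoint reproduces the same warning shape.
	fmt.Println(waitForPods("https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods", 15*time.Second))
}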
panic: test timed out after 2h0m0s
	running tests:
		TestStartStop (42m30s)
		TestStartStop/group/no-preload (29m27s)
		TestStartStop/group/no-preload/serial (29m27s)
		TestStartStop/group/no-preload/serial/AddonExistsAfterStop (4m5s)

goroutine 6735 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38
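Goroutine 6735 is the test binary's timeout alarm: testing.(*M).startAlarm fires when the -timeout flag (2h0m0s for this run) elapses, then panics and prints the goroutine dump that follows. A hypothetical minimal reproduction, assuming a file timeouts_test.go run with `go test -timeout 5s`:

// timeouts_test.go - sleeping past the -timeout deadline makes the test
// binary panic with "test timed out after ...", as seen above.
package timeouts

import (
	"testing"
	"time"
)

func TestSleepsPastDeadline(t *testing.T) {
	time.Sleep(time.Hour) // go test -timeout 5s -> panic: test timed out after 5s
}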

goroutine 1 [chan receive, 11 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4000365c00, 0x40006f9bb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x40006d0060, {0x534c680, 0x2c, 0x2c}, {0x40006f9d08?, 0x125774?, 0x53750c0?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x40006ad400)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x40006ad400)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 2134 [chan send, 80 minutes]:
os/exec.(*Cmd).watchCtx(0x400141ea80, 0x4004ef1110)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 2133
	/usr/local/go/src/os/exec/exec.go:775 +0x678
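Goroutines parked in "chan send" inside os/exec.(*Cmd).watchCtx for 80+ minutes (this one and several below) indicate started commands whose result was never consumed, typically because Wait was not called on the Cmd. A hedged sketch of the non-leaking pattern (runWithTimeout is illustrative):

// exec_sketch.go - pairing Start with Wait lets the internal watchCtx
// goroutine deliver its result and exit instead of blocking forever.
package main

import (
	"context"
	"fmt"
	"os/exec"
	"time"
)

func runWithTimeout() error {
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	cmd := exec.CommandContext(ctx, "sleep", "1")
	if err := cmd.Start(); err != nil {
		return err
	}
	return cmd.Wait() // consumes watchCtx's send; skipping this leaks the goroutine
}

func main() {
	fmt.Println(runWithTimeout())
}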

goroutine 3819 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001c56e90, 0x1a)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56e80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001363920)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4004ef1260?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40012fb6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40000d8f38, {0x369e4a0, 0x40015c0f60}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40012fb7a8?, {0x369e4a0?, 0x40015c0f60?}, 0x0?, 0x4000662910?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001a1eb00, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3829
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174
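Many of the parked goroutines in this dump are client-go certificate-rotation workers: each cached TLS transport (created by tlsTransportCache.get, per the stacks) runs a dynamicClientCert worker that blocks in workqueue Get until a key arrives. A hedged sketch of that consumer pattern, assuming k8s.io/client-go (v0.33.x, which provides the generic Typed queue) is on the module path:

// queue_sketch.go - the workqueue consumer loop seen in these stacks:
// Get parks in sync.Cond.Wait while the queue is empty.
package main

import (
	"fmt"

	"k8s.io/client-go/util/workqueue"
)

func main() {
	q := workqueue.NewTyped[string]()
	go func() {
		q.Add("rotate-cert")
		q.ShutDown()
	}()
	for {
		item, shutdown := q.Get() // blocks, like (*Typed).Get in the dump
		if shutdown {
			return
		}
		fmt.Println("processing", item)
		q.Done(item)
	}
}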

goroutine 162 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 154
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 155 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4004f1e1d0, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4004f1e1c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004f5c660)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40004dcc40?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40000d2f38, {0x369e4a0, 0x40004c1b60}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x369e4a0?, 0x40004c1b60?}, 0x30?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001511080, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 163
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4053 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40012f8740, 0x40017e9f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xaf?, 0x40012f8740, 0x40012f8788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x0?, 0x40012f8750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x400023c080?, 0x400150e8c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4050
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3325 [chan receive, 43 minutes]:
testing.(*T).Run(0x40014b3880, {0x296d71f?, 0x40000d3f58?}, 0x339bcf8)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x40014b3880)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x40014b3880, 0x339bb10)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364
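testing.(*T).Run blocks the parent test in a channel receive until the subtest signals completion, so TestStartStop sitting 43 minutes in "chan receive" here only means its subtest tree is still running. A minimal illustration (hypothetical test names):

// parent_test.go - the parent parks in (*T).Run's chan receive until
// the subtest returns, matching goroutine 3325 above.
package parent

import "testing"

func TestParent(t *testing.T) {
	t.Run("child", func(t *testing.T) {
		t.Log("while this runs, the parent waits in chan receive")
	})
}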

goroutine 156 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40000a7740, 0x40000d7f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xbb?, 0x40000a7740, 0x40000a7788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x0?, 0x40000a7750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x400023c080?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 163
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 163 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004f5c660, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 154
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 157 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 156
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 895 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 894
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 893 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001c57050, 0x2c)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c57040)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001741200)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40002be000?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40013c0f38, {0x369e4a0, 0x40016719b0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x40016719b0?}, 0x60?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40016648d0, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 918
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 2165 [chan send, 80 minutes]:
os/exec.(*Cmd).watchCtx(0x4001415680, 0x40014ff3b0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 2164
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4050 [chan receive, 37 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40006e7200, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4000
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4520 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4519
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1209 [select, 111 minutes]:
net/http.(*persistConn).writeLoop(0x4001c28360)
	/usr/local/go/src/net/http/transport.go:2600 +0x94
created by net/http.(*Transport).dialConn in goroutine 1206
	/usr/local/go/src/net/http/transport.go:1948 +0x1164

goroutine 3821 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3820
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4202 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40014a0740, 0x40006eff88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xf8?, 0x40014a0740, 0x40014a0788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x40014fc300?, 0x4001bafb80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001b76850?, 0x4001c74560?, 0x40013dddc0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4198
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4201 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001c7a750, 0x18)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c7a740)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001c350e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001f1c690?, 0x54bdd8?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40014a3ef8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40013c2f38, {0x369e4a0, 0x4001e81d70}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x369e4a0?, 0x4001e81d70?}, 0x5c?, 0x400141f800?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4004f07730, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4198
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1708 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40014f2f40, 0x40014f2f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xf0?, 0x40014f2f40, 0x40014f2f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x0?, 0x40012fc750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001415380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1756
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1709 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1708
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1352 [IO wait, 111 minutes]:
internal/poll.runtime_pollWait(0xffff50407600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001945900?, 0xdbd0c?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x4001945900)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x4001945900)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x4001c7b1c0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x4001c7b1c0)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x4001477e00, {0x36d3f80, 0x4001c7b1c0})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x4001477e00)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1350
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104
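Goroutine 1352 is the local HTTP proxy that startHTTPProxy launches for the StartWithProxy test; an http.Server goroutine parked in Accept is expected for as long as the server is never shut down. A hedged sketch of that lifecycle (address and timing are illustrative, not the test's actual values):

// proxy_sketch.go - a server goroutine stays in Accept, like goroutine
// 1352, until Shutdown releases it.
package main

import (
	"context"
	"net/http"
	"time"
)

func main() {
	srv := &http.Server{Addr: "127.0.0.1:0"} // port 0: pick any free port
	go srv.ListenAndServe()                  // parks in Accept
	time.Sleep(100 * time.Millisecond)
	srv.Shutdown(context.Background()) // releases the Accept loop
}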

goroutine 3749 [chan receive, 11 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4004e93dc0, 0x339bcf8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3325
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3828 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x40013dd6c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3808
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1155 [chan send, 111 minutes]:
os/exec.(*Cmd).watchCtx(0x4001b4ac00, 0x40019f39d0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1154
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3820 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40012fa740, 0x40012fa788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x48?, 0x40012fa740, 0x40012fa788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x4001bd8d80?, 0x40001643c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001414900?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3829
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4298 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x4001602900?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4294
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 716 [IO wait, 113 minutes]:
internal/poll.runtime_pollWait(0xffff50407200, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001354080?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x4001354080)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x4001354080)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x4001962a40)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x4001962a40)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x4001948300, {0x36d3f80, 0x4001962a40})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x4001948300)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 714
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 3926 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3925
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3919 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x40013dc1c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3918
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1089 [chan send, 111 minutes]:
os/exec.(*Cmd).watchCtx(0x4001a26780, 0x4001a2e310)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1056
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1208 [select, 111 minutes]:
net/http.(*persistConn).readLoop(0x4001c28360)
	/usr/local/go/src/net/http/transport.go:2398 +0xa6c
created by net/http.(*Transport).dialConn in goroutine 1206
	/usr/local/go/src/net/http/transport.go:1947 +0x111c

goroutine 4052 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x40003df350, 0x18)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40003df340)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40006e7200)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40014ff880?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0xd?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x4001382f38, {0x369e4a0, 0x4001c93440}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40000a17a8?, {0x369e4a0?, 0x4001c93440?}, 0x30?, 0x161f90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40018ff860, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4050
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 2225 [chan send, 80 minutes]:
os/exec.(*Cmd).watchCtx(0x4000332c00, 0x40004dc540)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1512
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1132 [chan send, 111 minutes]:
os/exec.(*Cmd).watchCtx(0x4001bbc600, 0x4001a2fb90)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 852
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 894 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40012ff740, 0x40013b4f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x18?, 0x40012ff740, 0x40012ff788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x4001cd1080?, 0x4001c8de00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400147ac00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 918
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 918 [chan receive, 111 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001741200, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 916
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4054 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4053
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4198 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001c350e0, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4193
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 917 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x40004a2180?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 916
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4197 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x400150f6c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4193
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4203 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4202
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3925 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40006f0f40, 0x40006f0f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xf0?, 0x40006f0f40, 0x40006f0f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x36e6598?, 0x4001cea5b0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40014fc600?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3920
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4299 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400147de00, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4294
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3829 [chan receive, 41 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001363920, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3808
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3753 [chan receive, 30 minutes]:
testing.(*T).Run(0x40014de1c0, {0x296eb91?, 0x0?}, 0x40018c9300)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x40014de1c0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x40014de1c0, 0x4001962200)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3749
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1707 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4001c56690, 0x24)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56680)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001741a40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40012f8718?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40012f86a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40014eef38, {0x369e4a0, 0x40016600c0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x40016600c0?}, 0xe0?, 0x4001d07080?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001510020, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1756
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1755 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x40014b28c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1754
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1756 [chan receive, 82 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001741a40, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1754
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4049 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x400150e8c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4000
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4903 [chan receive, 4 minutes]:
testing.(*T).Run(0x4001746a80, {0x2994252?, 0x40000006ee?}, 0x4001354000)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x4001746a80)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x4001746a80, 0x40018c9300)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3753
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4690 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x4001b77340?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4686
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4538 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x4001c4c540?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4534
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3920 [chan receive, 39 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40016e08a0, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3918
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3924 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001962a10, 0x19)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001962a00)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40016e08a0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40014fe7e0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40014a6ea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40014f0f38, {0x369e4a0, 0x4001616060}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40014a6fa8?, {0x369e4a0?, 0x4001616060?}, 0xe0?, 0x400141e480?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40008ae020, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3920
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4370 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4369
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4302 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x4001c56990, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56980)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400147de00)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400039f0a0?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x4001391f38, {0x369e4a0, 0x40012f5e00}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x40012f5e00?}, 0xd0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001a1e0a0, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4299
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4303 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x400156cf40, 0x400156cf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x88?, 0x400156cf40, 0x400156cf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x161f90?, 0x4004e936c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40013dc540?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4299
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4304 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4303
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4369 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x4001571740, 0x40013aef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xe0?, 0x4001571740, 0x4001571788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x36e6598?, 0x40014fe9a0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40014b2540?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4367
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4368 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x4000765e90, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4000765e80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001ba3920)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40000a76c8?, 0x2ab14?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40000a76a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x400138bf38, {0x369e4a0, 0x40012e1830}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40000a77a8?, {0x369e4a0?, 0x40012e1830?}, 0x10?, 0x4001a36480?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001f12130, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4367
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4367 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001ba3920, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4365
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4366 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x4001a36900?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4365
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4539 [chan receive, 34 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40015b8840, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4534
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4571 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x40014b2700?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4578
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4572 [chan receive, 32 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001c44e40, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4578
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4583 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40000a7740, 0x4001385f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x68?, 0x40000a7740, 0x40000a7788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x4001a36480?, 0x4001baf680?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40018e7380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4572
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4519 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40000d6f40, 0x40000d6f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x98?, 0x40000d6f40, 0x40000d6f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x400141fb00?, 0x40004aaf00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40018e7500?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4539
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4518 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4001c56390, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56380)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40015b8840)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a2f490?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x40012fbea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x400138ff38, {0x369e4a0, 0x4001668270}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40012fbfa8?, {0x369e4a0?, 0x4001668270?}, 0x70?, 0x40017bad80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400167e350, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4539
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4695 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x40012fbf40, 0x4001388f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0x2b?, 0x40012fbf40, 0x40012fbf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x0?, 0x40012fbf50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x400023c080?, 0x4001b77340?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4691
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4696 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4695
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4694 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001954810, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001954800)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001740e40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a2e380?, 0x1e27c?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x4001570ec8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40013c1f38, {0x369e4a0, 0x4001f1a3c0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x4004ef0540?, {0x369e4a0?, 0x4001f1a3c0?}, 0x10?, 0x4001b64c00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40014e89a0, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4691
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4584 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4583
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4582 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4001c56dd0, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56dc0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001c44e40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001f1ca80?, 0x2d6d6f747375632f?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x77656e205d33353a?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x40014f5f38, {0x369e4a0, 0x40012ed3b0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x3435383336202031?, {0x369e4a0?, 0x40012ed3b0?}, 0x31?, 0x622f6e69622f203a?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001f0a760, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4572
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 6229 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e6528, 0x4004f19bd0}, {0x36d45e0, 0x4001c2bb60}, 0x1, 0x0, 0x4001335b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e6598?, 0x40004922a0?}, 0x3b9aca00, 0x4001335d28?, 0x1, 0x4001335b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e6598, 0x40004922a0}, 0x4004e93a40, {0x4001b5a198, 0x11}, {0x2994202, 0x14}, {0x29ac171, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:380 +0x22c
k8s.io/minikube/test/integration.validateAddonAfterStop({0x36e6598, 0x40004922a0}, 0x4004e93a40, {0x4001b5a198, 0x11}, {0x297870e?, 0x1f93b4b100161e84?}, {0x694328a1?, 0x400138ef58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:285 +0xd4
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x4004e93a40?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x4004e93a40, 0x4001354000)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4903
	/usr/local/go/src/testing/testing.go:1997 +0x364
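
The one goroutine in this dump doing actual test work is 6229: PodWait in helpers_test.go is parked inside wait.PollUntilContextTimeout while TestStartStop waits for an addon pod to appear. The trace arguments decode to a 1-second interval (0x3b9aca00 ns), immediate=true (the 0x1), and a 540-second timeout (0x7dba821800 ns). Below is a minimal sketch of that polling pattern; the condition function is hypothetical and stands in for PodWait's real pod lookup, while the wait API and parameters are the ones visible in the trace.

    package main

    import (
        "context"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func podIsRunning(ctx context.Context) (bool, error) {
        // Hypothetical condition; the real PodWait lists pods in a
        // namespace and matches them against a label selector.
        return false, nil
    }

    func main() {
        // 1s interval, 9m timeout, immediate=true: run the condition once
        // before the first tick, then once per tick, until it returns true
        // or the timeout lapses with a context deadline error.
        err := wait.PollUntilContextTimeout(context.Background(),
            1*time.Second, 9*time.Minute, true, podIsRunning)
        fmt.Println(err)
    }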

goroutine 4691 [chan receive, 32 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001740e40, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4686
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5091 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x400023c080?}, 0x4001746380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5090
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5092 [chan receive, 13 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004f5d920, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5090
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5095 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4001c56a10, 0x2)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001c56a00)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004f5d920)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400032f340?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000084230?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000084230}, 0x4001469f38, {0x369e4a0, 0x40017574a0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x40017574a0?}, 0xa0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001bb39c0, 0x3b9aca00, 0x0, 0x1, 0x4000084230)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5092
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5096 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000084230}, 0x400156d740, 0x400156d788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000084230}, 0xb0?, 0x400156d740, 0x400156d788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000084230?}, 0x36e6598?, 0x4001a2f0a0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001746380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5092
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5097 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5096
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8
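
Every other goroutine in this dump is part of a client-go certificate-rotation loop: one dynamicClientCert.run per cached TLS transport (the "created by (*tlsTransportCache).get" frames), each with a queue worker blocked in workqueue Get (the sync.Cond.Wait frames) and a poller pair (the select frames). Their stop channel apparently never fires, which is why the ages span from 3 to 35 minutes and one such cluster accumulates per transport the tests create. The following stdlib-only sketch shows the blocking-queue worker shape those goroutines are parked in; the type and names are ours, not client-go's.

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    // blockingQueue mimics the Get of client-go's workqueue: a worker parks
    // on a condition variable until an item arrives, which is exactly the
    // sync.Cond.Wait frame seen in the traces above.
    type blockingQueue struct {
        mu    sync.Mutex
        cond  *sync.Cond
        items []string
    }

    func newBlockingQueue() *blockingQueue {
        q := &blockingQueue{}
        q.cond = sync.NewCond(&q.mu)
        return q
    }

    func (q *blockingQueue) Add(item string) {
        q.mu.Lock()
        q.items = append(q.items, item)
        q.mu.Unlock()
        q.cond.Signal()
    }

    func (q *blockingQueue) Get() string {
        q.mu.Lock()
        defer q.mu.Unlock()
        for len(q.items) == 0 {
            q.cond.Wait() // an idle worker sleeps here indefinitely
        }
        item := q.items[0]
        q.items = q.items[1:]
        return item
    }

    func main() {
        q := newBlockingQueue()
        go func() {
            for { // the runWorker/processNextWorkItem loop
                fmt.Println("processing", q.Get())
            }
        }()
        q.Add("rotate client cert")
        time.Sleep(100 * time.Millisecond)
        // The worker is now parked in Get; with no stop signal wired up it
        // would stay parked until the process exits, as in the dump above.
    }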


Test pass (307/369)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 6.44
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.1
9 TestDownloadOnly/v1.28.0/DeleteAll 0.22
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.14
12 TestDownloadOnly/v1.34.3/json-events 6.65
13 TestDownloadOnly/v1.34.3/preload-exists 0
17 TestDownloadOnly/v1.34.3/LogsDuration 0.1
18 TestDownloadOnly/v1.34.3/DeleteAll 0.22
19 TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds 0.13
21 TestDownloadOnly/v1.35.0-rc.1/json-events 6.59
22 TestDownloadOnly/v1.35.0-rc.1/preload-exists 0
26 TestDownloadOnly/v1.35.0-rc.1/LogsDuration 0.08
27 TestDownloadOnly/v1.35.0-rc.1/DeleteAll 0.22
28 TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds 0.14
30 TestBinaryMirror 0.59
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.08
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.08
36 TestAddons/Setup 127.59
38 TestAddons/serial/Volcano 42.85
40 TestAddons/serial/GCPAuth/Namespaces 0.2
41 TestAddons/serial/GCPAuth/FakeCredentials 8.86
44 TestAddons/parallel/Registry 15.32
45 TestAddons/parallel/RegistryCreds 0.73
46 TestAddons/parallel/Ingress 19.16
47 TestAddons/parallel/InspektorGadget 11.78
48 TestAddons/parallel/MetricsServer 5.82
50 TestAddons/parallel/CSI 53.35
51 TestAddons/parallel/Headlamp 16.82
52 TestAddons/parallel/CloudSpanner 6.78
53 TestAddons/parallel/LocalPath 51.38
54 TestAddons/parallel/NvidiaDevicePlugin 6.6
55 TestAddons/parallel/Yakd 10.83
57 TestAddons/StoppedEnableDisable 12.34
58 TestCertOptions 37.81
59 TestCertExpiration 234.46
61 TestForceSystemdFlag 48.17
62 TestForceSystemdEnv 40.67
63 TestDockerEnvContainerd 47.41
67 TestErrorSpam/setup 31.94
68 TestErrorSpam/start 0.81
69 TestErrorSpam/status 1.05
70 TestErrorSpam/pause 1.75
71 TestErrorSpam/unpause 1.83
72 TestErrorSpam/stop 1.66
75 TestFunctional/serial/CopySyncFile 0
76 TestFunctional/serial/StartWithProxy 48.26
77 TestFunctional/serial/AuditLog 0
78 TestFunctional/serial/SoftStart 7.22
79 TestFunctional/serial/KubeContext 0.07
80 TestFunctional/serial/KubectlGetPods 0.1
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.58
84 TestFunctional/serial/CacheCmd/cache/add_local 1.33
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.07
86 TestFunctional/serial/CacheCmd/cache/list 0.06
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.35
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.88
89 TestFunctional/serial/CacheCmd/cache/delete 0.12
90 TestFunctional/serial/MinikubeKubectlCmd 0.14
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.14
92 TestFunctional/serial/ExtraConfig 51.4
93 TestFunctional/serial/ComponentHealth 0.09
94 TestFunctional/serial/LogsCmd 1.47
95 TestFunctional/serial/LogsFileCmd 1.5
96 TestFunctional/serial/InvalidService 4.92
98 TestFunctional/parallel/ConfigCmd 0.47
99 TestFunctional/parallel/DashboardCmd 10.35
100 TestFunctional/parallel/DryRun 0.49
101 TestFunctional/parallel/InternationalLanguage 0.21
102 TestFunctional/parallel/StatusCmd 1.1
106 TestFunctional/parallel/ServiceCmdConnect 7.73
107 TestFunctional/parallel/AddonsCmd 0.13
108 TestFunctional/parallel/PersistentVolumeClaim 22.96
110 TestFunctional/parallel/SSHCmd 0.71
111 TestFunctional/parallel/CpCmd 2.47
113 TestFunctional/parallel/FileSync 0.37
114 TestFunctional/parallel/CertSync 2.21
118 TestFunctional/parallel/NodeLabels 0.11
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.56
122 TestFunctional/parallel/License 0.25
124 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.61
125 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
127 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 8.47
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.09
129 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0
133 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
134 TestFunctional/parallel/ServiceCmd/DeployApp 8.38
135 TestFunctional/parallel/ProfileCmd/profile_not_create 0.45
136 TestFunctional/parallel/ProfileCmd/profile_list 0.42
137 TestFunctional/parallel/ProfileCmd/profile_json_output 0.43
138 TestFunctional/parallel/ServiceCmd/List 0.74
139 TestFunctional/parallel/MountCmd/any-port 8.71
140 TestFunctional/parallel/ServiceCmd/JSONOutput 0.52
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.54
142 TestFunctional/parallel/ServiceCmd/Format 0.53
143 TestFunctional/parallel/ServiceCmd/URL 0.47
144 TestFunctional/parallel/MountCmd/specific-port 2.01
145 TestFunctional/parallel/MountCmd/VerifyCleanup 2.77
146 TestFunctional/parallel/Version/short 0.07
147 TestFunctional/parallel/Version/components 1.38
148 TestFunctional/parallel/ImageCommands/ImageListShort 0.3
149 TestFunctional/parallel/ImageCommands/ImageListTable 0.29
150 TestFunctional/parallel/ImageCommands/ImageListJson 0.27
151 TestFunctional/parallel/ImageCommands/ImageListYaml 0.27
152 TestFunctional/parallel/ImageCommands/ImageBuild 4.01
153 TestFunctional/parallel/ImageCommands/Setup 0.61
154 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.33
155 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.28
156 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
157 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.2
158 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
159 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.44
160 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.4
161 TestFunctional/parallel/ImageCommands/ImageRemove 0.54
162 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.63
163 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.38
164 TestFunctional/delete_echo-server_images 0.05
165 TestFunctional/delete_my-image_image 0.02
166 TestFunctional/delete_minikube_cached_images 0.02
170 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile 0
172 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog 0
174 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext 0.06
178 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote 3.31
179 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local 1.05
180 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete 0.07
181 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list 0.06
182 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node 0.28
183 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload 2.11
184 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete 0.12
189 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd 0.94
190 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd 0.94
193 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd 0.5
195 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun 0.46
196 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage 0.2
202 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd 0.14
205 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd 0.74
206 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd 2.04
208 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync 0.34
209 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync 2.09
215 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled 0.71
217 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License 0.33
218 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short 0.06
219 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components 0.5
220 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort 0.22
221 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable 0.22
222 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson 0.23
223 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml 0.23
224 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild 3.67
225 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup 0.28
226 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon 1.46
227 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon 1.4
228 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon 1.61
229 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes 0.16
230 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster 0.15
231 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters 0.15
232 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile 0.42
233 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove 0.58
234 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile 0.83
238 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon 0.51
244 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel 0
251 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel 0.1
252 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create 0.39
253 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list 0.39
254 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output 0.38
256 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port 1.72
257 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup 1.82
258 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images 0.04
259 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image 0.02
260 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images 0.02
264 TestMultiControlPlane/serial/StartCluster 155.03
265 TestMultiControlPlane/serial/DeployApp 7.68
266 TestMultiControlPlane/serial/PingHostFromPods 1.56
267 TestMultiControlPlane/serial/AddWorkerNode 32.32
268 TestMultiControlPlane/serial/NodeLabels 0.11
269 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.06
270 TestMultiControlPlane/serial/CopyFile 20.31
271 TestMultiControlPlane/serial/StopSecondaryNode 13.02
272 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.8
273 TestMultiControlPlane/serial/RestartSecondaryNode 13.94
274 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.39
275 TestMultiControlPlane/serial/RestartClusterKeepsNodes 98.86
276 TestMultiControlPlane/serial/DeleteSecondaryNode 11.18
277 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.85
278 TestMultiControlPlane/serial/StopCluster 36.48
279 TestMultiControlPlane/serial/RestartCluster 59.27
280 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.82
281 TestMultiControlPlane/serial/AddSecondaryNode 83.92
282 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.08
287 TestJSONOutput/start/Command 48.22
288 TestJSONOutput/start/Audit 0
290 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
291 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
293 TestJSONOutput/pause/Command 0.77
294 TestJSONOutput/pause/Audit 0
296 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
297 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
299 TestJSONOutput/unpause/Command 0.62
300 TestJSONOutput/unpause/Audit 0
302 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
303 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
305 TestJSONOutput/stop/Command 5.97
306 TestJSONOutput/stop/Audit 0
308 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
309 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
310 TestErrorJSONOutput 0.24
312 TestKicCustomNetwork/create_custom_network 41.51
313 TestKicCustomNetwork/use_default_bridge_network 40.69
314 TestKicExistingNetwork 36.12
315 TestKicCustomSubnet 35.66
316 TestKicStaticIP 35.88
317 TestMainNoArgs 0.05
318 TestMinikubeProfile 70.3
321 TestMountStart/serial/StartWithMountFirst 8.74
322 TestMountStart/serial/VerifyMountFirst 0.27
323 TestMountStart/serial/StartWithMountSecond 8.76
324 TestMountStart/serial/VerifyMountSecond 0.27
325 TestMountStart/serial/DeleteFirst 1.69
326 TestMountStart/serial/VerifyMountPostDelete 0.26
327 TestMountStart/serial/Stop 1.29
328 TestMountStart/serial/RestartStopped 7.98
329 TestMountStart/serial/VerifyMountPostStop 0.31
332 TestMultiNode/serial/FreshStart2Nodes 81.68
333 TestMultiNode/serial/DeployApp2Nodes 5.36
334 TestMultiNode/serial/PingHostFrom2Pods 1.06
335 TestMultiNode/serial/AddNode 29.52
336 TestMultiNode/serial/MultiNodeLabels 0.09
337 TestMultiNode/serial/ProfileList 0.73
338 TestMultiNode/serial/CopyFile 10.3
339 TestMultiNode/serial/StopNode 2.43
340 TestMultiNode/serial/StartAfterStop 8.18
341 TestMultiNode/serial/RestartKeepsNodes 81.48
342 TestMultiNode/serial/DeleteNode 5.68
343 TestMultiNode/serial/StopMultiNode 24.11
344 TestMultiNode/serial/RestartMultiNode 56.36
345 TestMultiNode/serial/ValidateNameConflict 36.14
350 TestPreload 119.86
352 TestScheduledStopUnix 105.04
355 TestInsufficientStorage 12.61
356 TestRunningBinaryUpgrade 66.08
359 TestMissingContainerUpgrade 132.85
361 TestNoKubernetes/serial/StartNoK8sWithVersion 0.1
362 TestNoKubernetes/serial/StartWithK8s 40.37
363 TestNoKubernetes/serial/StartWithStopK8s 19.78
364 TestNoKubernetes/serial/Start 8.52
365 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
366 TestNoKubernetes/serial/VerifyK8sNotRunning 0.29
367 TestNoKubernetes/serial/ProfileList 0.7
368 TestNoKubernetes/serial/Stop 1.3
369 TestNoKubernetes/serial/StartNoArgs 6.78
370 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.28
378 TestNetworkPlugins/group/false 4.76
382 TestStoppedBinaryUpgrade/Setup 1.26
383 TestStoppedBinaryUpgrade/Upgrade 304.9
384 TestStoppedBinaryUpgrade/MinikubeLogs 2.14
393 TestPause/serial/Start 51.23
394 TestPause/serial/SecondStartNoReconfiguration 6.45
395 TestPause/serial/Pause 0.71
396 TestPause/serial/VerifyStatus 0.36
397 TestPause/serial/Unpause 0.67
398 TestPause/serial/PauseAgain 0.86
399 TestPause/serial/DeletePaused 2.86
400 TestPause/serial/VerifyDeletedResources 0.41
401 TestNetworkPlugins/group/auto/Start 53.26
402 TestNetworkPlugins/group/auto/KubeletFlags 0.33
403 TestNetworkPlugins/group/auto/NetCatPod 10.28
404 TestNetworkPlugins/group/auto/DNS 0.2
405 TestNetworkPlugins/group/auto/Localhost 0.15
406 TestNetworkPlugins/group/auto/HairPin 0.15
407 TestNetworkPlugins/group/kindnet/Start 46.74
408 TestNetworkPlugins/group/kindnet/ControllerPod 6.01
409 TestNetworkPlugins/group/kindnet/KubeletFlags 0.31
410 TestNetworkPlugins/group/kindnet/NetCatPod 10.25
411 TestNetworkPlugins/group/kindnet/DNS 0.19
412 TestNetworkPlugins/group/kindnet/Localhost 0.15
413 TestNetworkPlugins/group/kindnet/HairPin 0.14
414 TestNetworkPlugins/group/calico/Start 80.33
415 TestNetworkPlugins/group/calico/ControllerPod 6.01
416 TestNetworkPlugins/group/calico/KubeletFlags 0.32
417 TestNetworkPlugins/group/calico/NetCatPod 8.26
418 TestNetworkPlugins/group/calico/DNS 0.19
419 TestNetworkPlugins/group/calico/Localhost 0.14
420 TestNetworkPlugins/group/calico/HairPin 0.18
421 TestNetworkPlugins/group/custom-flannel/Start 57.84
422 TestNetworkPlugins/group/enable-default-cni/Start 82.33
423 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.31
424 TestNetworkPlugins/group/custom-flannel/NetCatPod 9.26
425 TestNetworkPlugins/group/custom-flannel/DNS 0.23
426 TestNetworkPlugins/group/custom-flannel/Localhost 0.19
427 TestNetworkPlugins/group/custom-flannel/HairPin 0.2
428 TestNetworkPlugins/group/flannel/Start 57.64
429 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.45
430 TestNetworkPlugins/group/enable-default-cni/NetCatPod 9.38
431 TestNetworkPlugins/group/enable-default-cni/DNS 0.18
432 TestNetworkPlugins/group/enable-default-cni/Localhost 0.15
433 TestNetworkPlugins/group/enable-default-cni/HairPin 0.15
434 TestNetworkPlugins/group/flannel/ControllerPod 6.01
435 TestNetworkPlugins/group/flannel/KubeletFlags 0.4
436 TestNetworkPlugins/group/flannel/NetCatPod 10.36
437 TestNetworkPlugins/group/bridge/Start 77.02
438 TestNetworkPlugins/group/flannel/DNS 0.36
439 TestNetworkPlugins/group/flannel/Localhost 0.31
440 TestNetworkPlugins/group/flannel/HairPin 0.18
443 TestNetworkPlugins/group/bridge/KubeletFlags 0.38
444 TestNetworkPlugins/group/bridge/NetCatPod 11.38
445 TestNetworkPlugins/group/bridge/DNS 0.18
446 TestNetworkPlugins/group/bridge/Localhost 0.2
447 TestNetworkPlugins/group/bridge/HairPin 0.16
TestDownloadOnly/v1.28.0/json-events (6.44s)

=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-808830 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-808830 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (6.437335615s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (6.44s)
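
The json-events subtest drives `minikube start -o=json`, which emits one JSON event per line on stdout. A hedged sketch of consuming such a line-delimited event stream from the subprocess follows; the binary path and profile are copied from the log, while the "type" field and the decoding into a generic map are illustrative, not the test's actual schema handling.

    package main

    import (
        "bufio"
        "encoding/json"
        "fmt"
        "os/exec"
    )

    func main() {
        cmd := exec.Command("out/minikube-linux-arm64", "start",
            "-o=json", "--download-only", "-p", "download-only-808830")
        stdout, err := cmd.StdoutPipe()
        if err != nil {
            panic(err)
        }
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        sc := bufio.NewScanner(stdout)
        for sc.Scan() {
            var ev map[string]any // each stdout line is a single JSON event
            if err := json.Unmarshal(sc.Bytes(), &ev); err != nil {
                fmt.Println("non-JSON line:", sc.Text())
                continue
            }
            fmt.Println("event type:", ev["type"])
        }
        _ = cmd.Wait()
    }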

TestDownloadOnly/v1.28.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1217 20:07:25.133630  369461 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
I1217 20:07:25.133747  369461 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)
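
The preload-exists subtest only asserts that the tarball fetched in the previous step landed in the local cache. The shape of that check is a single stat call, sketched below; the path is taken verbatim from the log line above, and the real assertion lives in aaa_download_only_test.go.

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        p := "/home/jenkins/minikube-integration/21808-367595/.minikube/cache/" +
            "preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4"
        if fi, err := os.Stat(p); err != nil {
            fmt.Println("preload missing:", err)
        } else {
            fmt.Println("preload exists,", fi.Size(), "bytes")
        }
    }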

TestDownloadOnly/v1.28.0/LogsDuration (0.1s)

=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-808830
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-808830: exit status 85 (97.735371ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-808830 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-808830 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:07:18
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:07:18.743003  369467 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:07:18.743233  369467 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:18.743262  369467 out.go:374] Setting ErrFile to fd 2...
	I1217 20:07:18.743283  369467 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:18.743580  369467 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	W1217 20:07:18.743752  369467 root.go:314] Error reading config file at /home/jenkins/minikube-integration/21808-367595/.minikube/config/config.json: open /home/jenkins/minikube-integration/21808-367595/.minikube/config/config.json: no such file or directory
	I1217 20:07:18.744336  369467 out.go:368] Setting JSON to true
	I1217 20:07:18.745239  369467 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10184,"bootTime":1765991855,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:07:18.745337  369467 start.go:143] virtualization:  
	I1217 20:07:18.751891  369467 out.go:99] [download-only-808830] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1217 20:07:18.752079  369467 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball: no such file or directory
	I1217 20:07:18.752183  369467 notify.go:221] Checking for updates...
	I1217 20:07:18.755368  369467 out.go:171] MINIKUBE_LOCATION=21808
	I1217 20:07:18.758717  369467 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:07:18.761819  369467 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:07:18.764924  369467 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:07:18.767869  369467 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1217 20:07:18.773818  369467 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1217 20:07:18.774079  369467 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:07:18.797808  369467 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:07:18.797926  369467 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:18.861320  369467 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-17 20:07:18.851215513 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:18.861424  369467 docker.go:319] overlay module found
	I1217 20:07:18.864495  369467 out.go:99] Using the docker driver based on user configuration
	I1217 20:07:18.864541  369467 start.go:309] selected driver: docker
	I1217 20:07:18.864549  369467 start.go:927] validating driver "docker" against <nil>
	I1217 20:07:18.864656  369467 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:18.920171  369467 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-17 20:07:18.910875302 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:18.920368  369467 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1217 20:07:18.920665  369467 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1217 20:07:18.920833  369467 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1217 20:07:18.924039  369467 out.go:171] Using Docker driver with root privileges
	I1217 20:07:18.927152  369467 cni.go:84] Creating CNI manager for ""
	I1217 20:07:18.927230  369467 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:07:18.927243  369467 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1217 20:07:18.927340  369467 start.go:353] cluster config:
	{Name:download-only-808830 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-808830 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Co
ntainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:07:18.930424  369467 out.go:99] Starting "download-only-808830" primary control-plane node in "download-only-808830" cluster
	I1217 20:07:18.930467  369467 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:07:18.933513  369467 out.go:99] Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:07:18.933571  369467 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1217 20:07:18.933652  369467 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:07:18.949874  369467 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 to local cache
	I1217 20:07:18.950071  369467 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local cache directory
	I1217 20:07:18.950166  369467 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 to local cache
	I1217 20:07:19.000627  369467 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1217 20:07:19.000676  369467 cache.go:65] Caching tarball of preloaded images
	I1217 20:07:19.002347  369467 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1217 20:07:19.005908  369467 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1217 20:07:19.005949  369467 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1217 20:07:19.103286  369467 preload.go:295] Got checksum from GCS API "38d7f581f2fa4226c8af2c9106b982b7"
	I1217 20:07:19.103420  369467 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4?checksum=md5:38d7f581f2fa4226c8af2c9106b982b7 -> /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-808830 host does not exist
	  To start a cluster, run: "minikube start -p download-only-808830"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.10s)
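Aside on the preload flow the stdout above records: minikube first asks the GCS API for the tarball's MD5 (preload.go:318/295), then appends it to the download URL as ?checksum=md5:... so the transfer can be verified. Below is a minimal Go sketch of that verification step, under the assumption that the hash is checked client-side after download; verifyMD5 and the cache path are illustrative, not minikube's actual download code.

package main

// Sketch of the checksum step shown in the log above: hash the downloaded
// preload tarball with MD5 and compare against the value the GCS API
// returned ("38d7f581f2fa4226c8af2c9106b982b7" for the v1.28.0 preload).
import (
	"crypto/md5"
	"encoding/hex"
	"fmt"
	"io"
	"os"
)

func verifyMD5(path, expected string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()
	h := md5.New()
	if _, err := io.Copy(h, f); err != nil {
		return err
	}
	if got := hex.EncodeToString(h.Sum(nil)); got != expected {
		return fmt.Errorf("checksum mismatch: got %s, want %s", got, expected)
	}
	return nil
}

func main() {
	// Path and checksum taken from the download.go:108 line above;
	// MINIKUBE_HOME is assumed to point at the .minikube directory.
	tarball := os.ExpandEnv("$MINIKUBE_HOME/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4")
	if err := verifyMD5(tarball, "38d7f581f2fa4226c8af2c9106b982b7"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("preload tarball checksum OK")
}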
TestDownloadOnly/v1.28.0/DeleteAll (0.22s)
=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)
=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-808830
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.34.3/json-events (6.65s)
=== RUN   TestDownloadOnly/v1.34.3/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-893616 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-893616 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (6.649436037s)
--- PASS: TestDownloadOnly/v1.34.3/json-events (6.65s)

TestDownloadOnly/v1.34.3/preload-exists (0s)
=== RUN   TestDownloadOnly/v1.34.3/preload-exists
I1217 20:07:32.243675  369461 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
I1217 20:07:32.243710  369461 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.3/preload-exists (0.00s)

TestDownloadOnly/v1.34.3/LogsDuration (0.1s)
=== RUN   TestDownloadOnly/v1.34.3/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-893616
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-893616: exit status 85 (94.431459ms)
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-808830 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-808830 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                 │ minikube             │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ delete  │ -p download-only-808830                                                                                                                                                               │ download-only-808830 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ start   │ -o=json --download-only -p download-only-893616 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-893616 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:07:25
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:07:25.639878  369672 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:07:25.641028  369672 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:25.641085  369672 out.go:374] Setting ErrFile to fd 2...
	I1217 20:07:25.641108  369672 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:25.641390  369672 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:07:25.641861  369672 out.go:368] Setting JSON to true
	I1217 20:07:25.642704  369672 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10191,"bootTime":1765991855,"procs":149,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:07:25.642820  369672 start.go:143] virtualization:  
	I1217 20:07:25.646291  369672 out.go:99] [download-only-893616] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:07:25.646564  369672 notify.go:221] Checking for updates...
	I1217 20:07:25.649568  369672 out.go:171] MINIKUBE_LOCATION=21808
	I1217 20:07:25.652684  369672 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:07:25.655610  369672 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:07:25.658484  369672 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:07:25.661299  369672 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1217 20:07:25.666995  369672 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1217 20:07:25.667299  369672 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:07:25.691149  369672 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:07:25.691260  369672 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:25.749064  369672 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:48 SystemTime:2025-12-17 20:07:25.73936469 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:25.749209  369672 docker.go:319] overlay module found
	I1217 20:07:25.752290  369672 out.go:99] Using the docker driver based on user configuration
	I1217 20:07:25.752327  369672 start.go:309] selected driver: docker
	I1217 20:07:25.752337  369672 start.go:927] validating driver "docker" against <nil>
	I1217 20:07:25.752448  369672 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:25.816178  369672 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:48 SystemTime:2025-12-17 20:07:25.806709717 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:25.816349  369672 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1217 20:07:25.816640  369672 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1217 20:07:25.816798  369672 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1217 20:07:25.819975  369672 out.go:171] Using Docker driver with root privileges
	I1217 20:07:25.822784  369672 cni.go:84] Creating CNI manager for ""
	I1217 20:07:25.822856  369672 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:07:25.822874  369672 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1217 20:07:25.822956  369672 start.go:353] cluster config:
	{Name:download-only-893616 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:download-only-893616 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:07:25.825928  369672 out.go:99] Starting "download-only-893616" primary control-plane node in "download-only-893616" cluster
	I1217 20:07:25.825955  369672 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:07:25.828802  369672 out.go:99] Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:07:25.828849  369672 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1217 20:07:25.828971  369672 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:07:25.845221  369672 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 to local cache
	I1217 20:07:25.845366  369672 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local cache directory
	I1217 20:07:25.845393  369672 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local cache directory, skipping pull
	I1217 20:07:25.845400  369672 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in cache, skipping pull
	I1217 20:07:25.845409  369672 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 as a tarball
	I1217 20:07:25.890007  369672 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.3/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
	I1217 20:07:25.890037  369672 cache.go:65] Caching tarball of preloaded images
	I1217 20:07:25.890216  369672 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1217 20:07:25.893214  369672 out.go:99] Downloading Kubernetes v1.34.3 preload ...
	I1217 20:07:25.893243  369672 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1217 20:07:25.986101  369672 preload.go:295] Got checksum from GCS API "cec854b4ba05b56d256f7c601add2b98"
	I1217 20:07:25.986176  369672 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.3/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4?checksum=md5:cec854b4ba05b56d256f7c601add2b98 -> /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-893616 host does not exist
	  To start a cluster, run: "minikube start -p download-only-893616"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.3/LogsDuration (0.10s)
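One difference from the first run is visible at image.go:68/137 above: the kicbase tarball is already in the local cache, so the pull is skipped and only the preload still has to be fetched. A rough sketch of that short-circuit follows; the digest-keyed cache/kic/<digest>.tar layout under MINIKUBE_HOME is an assumption for illustration, not taken from minikube's source.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cachedBaseImage reports whether a digest-addressed kicbase tarball is
// already on disk, the condition behind "exists in cache, skipping pull"
// in the log above. The cache/kic/<digest>.tar layout is hypothetical.
func cachedBaseImage(minikubeHome, digest string) bool {
	info, err := os.Stat(filepath.Join(minikubeHome, "cache", "kic", digest+".tar"))
	return err == nil && info.Size() > 0
}

func main() {
	digest := "sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78"
	if cachedBaseImage(os.Getenv("MINIKUBE_HOME"), digest) {
		fmt.Println("kicbase found in local cache, skipping pull")
	} else {
		fmt.Println("kicbase not cached, downloading")
	}
}

Keying the cache by content digest rather than tag is what makes the skip safe: the same tag could move, but the sha256 in the image reference cannot.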
TestDownloadOnly/v1.34.3/DeleteAll (0.22s)
=== RUN   TestDownloadOnly/v1.34.3/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.3/DeleteAll (0.22s)

TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.13s)
=== RUN   TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-893616
--- PASS: TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.13s)

TestDownloadOnly/v1.35.0-rc.1/json-events (6.59s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-462834 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-462834 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (6.588975996s)
--- PASS: TestDownloadOnly/v1.35.0-rc.1/json-events (6.59s)

TestDownloadOnly/v1.35.0-rc.1/preload-exists (0s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/preload-exists
I1217 20:07:39.283769  369461 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
I1217 20:07:39.283811  369461 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.35.0-rc.1/preload-exists (0.00s)

TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.08s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-462834
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-462834: exit status 85 (82.654138ms)
-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                            ARGS                                                                                            │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-808830 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd      │ download-only-808830 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                      │ minikube             │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ delete  │ -p download-only-808830                                                                                                                                                                    │ download-only-808830 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ start   │ -o=json --download-only -p download-only-893616 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd      │ download-only-893616 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                      │ minikube             │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ delete  │ -p download-only-893616                                                                                                                                                                    │ download-only-893616 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │ 17 Dec 25 20:07 UTC │
	│ start   │ -o=json --download-only -p download-only-462834 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-462834 │ jenkins │ v1.37.0 │ 17 Dec 25 20:07 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/17 20:07:32
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1217 20:07:32.741764  369866 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:07:32.741995  369866 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:32.742025  369866 out.go:374] Setting ErrFile to fd 2...
	I1217 20:07:32.742047  369866 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:07:32.742307  369866 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:07:32.742738  369866 out.go:368] Setting JSON to true
	I1217 20:07:32.743614  369866 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10198,"bootTime":1765991855,"procs":149,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:07:32.743704  369866 start.go:143] virtualization:  
	I1217 20:07:32.747028  369866 out.go:99] [download-only-462834] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:07:32.747275  369866 notify.go:221] Checking for updates...
	I1217 20:07:32.750243  369866 out.go:171] MINIKUBE_LOCATION=21808
	I1217 20:07:32.753244  369866 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:07:32.756187  369866 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:07:32.759034  369866 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:07:32.761933  369866 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1217 20:07:32.767523  369866 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1217 20:07:32.767852  369866 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:07:32.792541  369866 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:07:32.792671  369866 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:32.848310  369866 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-17 20:07:32.839494888 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:32.848412  369866 docker.go:319] overlay module found
	I1217 20:07:32.851371  369866 out.go:99] Using the docker driver based on user configuration
	I1217 20:07:32.851396  369866 start.go:309] selected driver: docker
	I1217 20:07:32.851404  369866 start.go:927] validating driver "docker" against <nil>
	I1217 20:07:32.851529  369866 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:07:32.904715  369866 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-17 20:07:32.896196376 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:07:32.904872  369866 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1217 20:07:32.905174  369866 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1217 20:07:32.905336  369866 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1217 20:07:32.908396  369866 out.go:171] Using Docker driver with root privileges
	I1217 20:07:32.911120  369866 cni.go:84] Creating CNI manager for ""
	I1217 20:07:32.911185  369866 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1217 20:07:32.911199  369866 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1217 20:07:32.911277  369866 start.go:353] cluster config:
	{Name:download-only-462834 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:download-only-462834 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:07:32.914228  369866 out.go:99] Starting "download-only-462834" primary control-plane node in "download-only-462834" cluster
	I1217 20:07:32.914246  369866 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1217 20:07:32.917155  369866 out.go:99] Pulling base image v0.0.48-1765661130-22141 ...
	I1217 20:07:32.917196  369866 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:07:32.917379  369866 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local docker daemon
	I1217 20:07:32.933007  369866 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 to local cache
	I1217 20:07:32.933128  369866 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local cache directory
	I1217 20:07:32.933155  369866 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 in local cache directory, skipping pull
	I1217 20:07:32.933160  369866 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 exists in cache, skipping pull
	I1217 20:07:32.933168  369866 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 as a tarball
	I1217 20:07:32.971094  369866 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-rc.1/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1217 20:07:32.971128  369866 cache.go:65] Caching tarball of preloaded images
	I1217 20:07:32.971297  369866 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1217 20:07:32.974198  369866 out.go:99] Downloading Kubernetes v1.35.0-rc.1 preload ...
	I1217 20:07:32.974225  369866 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1217 20:07:33.075123  369866 preload.go:295] Got checksum from GCS API "cc3cb8adfb5c28e7151fdb563481b9b1"
	I1217 20:07:33.075182  369866 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-rc.1/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4?checksum=md5:cc3cb8adfb5c28e7151fdb563481b9b1 -> /home/jenkins/minikube-integration/21808-367595/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-462834 host does not exist
	  To start a cluster, run: "minikube start -p download-only-462834"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.08s)
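Each LogsDuration subtest deliberately runs minikube logs against a profile whose host was never created and treats exit status 85 as the expected outcome. Below is a self-contained sketch of asserting a specific non-zero exit code from a CLI in Go; runExpectingExit is an illustrative helper, and 85 is simply the code observed in these runs, not a documented contract.

package main

import (
	"errors"
	"fmt"
	"os"
	"os/exec"
)

// runExpectingExit runs a command and succeeds only if it exits with the
// expected non-zero status, mirroring the "(dbg) Non-zero exit: ...
// exit status 85" assertions above.
func runExpectingExit(want int, name string, args ...string) error {
	err := exec.Command(name, args...).Run()
	var ee *exec.ExitError
	if errors.As(err, &ee) {
		if ee.ExitCode() == want {
			return nil
		}
		return fmt.Errorf("exit status %d, want %d", ee.ExitCode(), want)
	}
	return fmt.Errorf("want exit status %d, got err=%v", want, err)
}

func main() {
	err := runExpectingExit(85, "out/minikube-linux-arm64", "logs", "-p", "download-only-462834")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("command failed with the expected exit status")
}

Using errors.As with *exec.ExitError distinguishes "ran and exited non-zero" from "could not run at all", which is exactly the distinction these subtests depend on.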
TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.22s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.22s)

TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.14s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-462834
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.14s)

TestBinaryMirror (0.59s)
=== RUN   TestBinaryMirror
I1217 20:07:40.587016  369461 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-591977 --alsologtostderr --binary-mirror http://127.0.0.1:36011 --driver=docker  --container-runtime=containerd
helpers_test.go:176: Cleaning up "binary-mirror-591977" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-591977
--- PASS: TestBinaryMirror (0.59s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)
=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1002: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-060437
addons_test.go:1002: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-060437: exit status 85 (82.719968ms)
-- stdout --
	* Profile "addons-060437" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-060437"
-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)
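Both PreSetup subtests rely on addon commands failing fast when the target profile does not exist yet. Judging by the profile paths that appear elsewhere in this report (.minikube/profiles/<name>/...), the lookup amounts to a stat on a per-profile config file; the following is a sketch under that assumption, with the config.json file name guessed for illustration rather than confirmed from minikube's source.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// profileExists checks for a per-profile config file under MINIKUBE_HOME,
// the kind of lookup that lets "addons enable" refuse to run against the
// missing addons-060437 profile above. profiles/<name>/config.json is an
// assumption based on the directory layout visible in this report.
func profileExists(minikubeHome, name string) bool {
	_, err := os.Stat(filepath.Join(minikubeHome, "profiles", name, "config.json"))
	return err == nil
}

func main() {
	if !profileExists(os.Getenv("MINIKUBE_HOME"), "addons-060437") {
		fmt.Println(`* Profile "addons-060437" not found. Run "minikube profile list" to view all profiles.`)
		os.Exit(85) // the exit code observed in the runs above
	}
	fmt.Println("profile exists")
}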
TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)
=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1013: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-060437
addons_test.go:1013: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-060437: exit status 85 (75.089197ms)
-- stdout --
	* Profile "addons-060437" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-060437"
-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

TestAddons/Setup (127.59s)
=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-linux-arm64 start -p addons-060437 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:110: (dbg) Done: out/minikube-linux-arm64 start -p addons-060437 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m7.589831051s)
--- PASS: TestAddons/Setup (127.59s)

TestAddons/serial/Volcano (42.85s)
=== RUN   TestAddons/serial/Volcano
addons_test.go:886: volcano-controller stabilized in 51.32794ms
addons_test.go:878: volcano-admission stabilized in 51.980162ms
addons_test.go:870: volcano-scheduler stabilized in 52.265637ms
addons_test.go:892: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-scheduler-76c996c8bf-b9zzc" [f2d05db4-1a3e-4b1c-b7b0-2604087214ce] Running
addons_test.go:892: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 6.004451241s
addons_test.go:896: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-admission-6c447bd768-mkhcm" [1ac661b1-bca3-4bf4-89ae-aa75e0b13528] Pending / Ready:ContainersNotReady (containers with unready status: [admission]) / ContainersReady:ContainersNotReady (containers with unready status: [admission])
helpers_test.go:353: "volcano-admission-6c447bd768-mkhcm" [1ac661b1-bca3-4bf4-89ae-aa75e0b13528] Running
addons_test.go:896: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 6.005592248s
addons_test.go:900: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-controllers-6fd4f85cb8-7qqd2" [79d9891f-05b0-4b83-83e3-04ff7a6dbd99] Running
addons_test.go:900: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 6.003137777s
addons_test.go:905: (dbg) Run:  kubectl --context addons-060437 delete -n volcano-system job volcano-admission-init
addons_test.go:911: (dbg) Run:  kubectl --context addons-060437 create -f testdata/vcjob.yaml
addons_test.go:919: (dbg) Run:  kubectl --context addons-060437 get vcjob -n my-volcano
addons_test.go:937: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:353: "test-job-nginx-0" [ad1b885d-4e8c-4aa7-8770-e7e9421eb82d] Pending
helpers_test.go:353: "test-job-nginx-0" [ad1b885d-4e8c-4aa7-8770-e7e9421eb82d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "test-job-nginx-0" [ad1b885d-4e8c-4aa7-8770-e7e9421eb82d] Running
addons_test.go:937: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 12.008321078s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable volcano --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable volcano --alsologtostderr -v=1: (12.186078938s)
--- PASS: TestAddons/serial/Volcano (42.85s)
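The Volcano checks repeat one pattern: wait up to a deadline for every pod matching a label selector to become healthy. Below is a compact client-go sketch of that loop, simplified to treat phase Running as healthy; the real helpers also track Ready conditions, which is why the log above shows Pending / Ready:ContainersNotReady transitions before success. Namespace and selector are taken from the test; the kubeconfig wiring is generic boilerplate, not minikube's.

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForLabel polls until every pod matching selector in ns is Running,
// the same condition the "healthy within ..." lines above report.
func waitForLabel(cs *kubernetes.Clientset, ns, selector string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		pods, err := cs.CoreV1().Pods(ns).List(context.TODO(), metav1.ListOptions{LabelSelector: selector})
		if err == nil && len(pods.Items) > 0 {
			ready := true
			for _, p := range pods.Items {
				if p.Status.Phase != corev1.PodRunning {
					ready = false
					break
				}
			}
			if ready {
				return nil
			}
		}
		time.Sleep(2 * time.Second)
	}
	return fmt.Errorf("pods %q in %q not Running within %v", selector, ns, timeout)
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	if err := waitForLabel(cs, "volcano-system", "app=volcano-scheduler", 6*time.Minute); err != nil {
		panic(err)
	}
	fmt.Println("app=volcano-scheduler healthy")
}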
TestAddons/serial/GCPAuth/Namespaces (0.2s)
=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:632: (dbg) Run:  kubectl --context addons-060437 create ns new-namespace
addons_test.go:646: (dbg) Run:  kubectl --context addons-060437 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.20s)

TestAddons/serial/GCPAuth/FakeCredentials (8.86s)
=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:677: (dbg) Run:  kubectl --context addons-060437 create -f testdata/busybox.yaml
addons_test.go:684: (dbg) Run:  kubectl --context addons-060437 create sa gcp-auth-test
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:353: "busybox" [e1979469-5f53-4bde-837c-50576e60b48e] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "busybox" [e1979469-5f53-4bde-837c-50576e60b48e] Running
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 8.003662227s
addons_test.go:696: (dbg) Run:  kubectl --context addons-060437 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:708: (dbg) Run:  kubectl --context addons-060437 describe sa gcp-auth-test
addons_test.go:722: (dbg) Run:  kubectl --context addons-060437 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:746: (dbg) Run:  kubectl --context addons-060437 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (8.86s)

TestAddons/parallel/Registry (15.32s)
=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry
=== CONT  TestAddons/parallel/Registry
addons_test.go:384: registry stabilized in 10.043528ms
addons_test.go:386: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-6b586f9694-wm48l" [82aaefd1-fb78-4557-9b38-fed7d8b63a10] Running
addons_test.go:386: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.003471123s
addons_test.go:389: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-proxy-4zlzr" [0f693ddd-f349-442a-bce0-fa1979cac7d9] Running
addons_test.go:389: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.009617813s
addons_test.go:394: (dbg) Run:  kubectl --context addons-060437 delete po -l run=registry-test --now
addons_test.go:399: (dbg) Run:  kubectl --context addons-060437 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:399: (dbg) Done: kubectl --context addons-060437 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.219763167s)
addons_test.go:413: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 ip
2025/12/17 20:11:04 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.32s)

TestAddons/parallel/RegistryCreds (0.73s)
=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds
=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:325: registry-creds stabilized in 2.995038ms
addons_test.go:327: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-060437
addons_test.go:334: (dbg) Run:  kubectl --context addons-060437 -n kube-system get secret -o yaml
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable registry-creds --alsologtostderr -v=1
--- PASS: TestAddons/parallel/RegistryCreds (0.73s)

TestAddons/parallel/Ingress (19.16s)
=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress
=== CONT  TestAddons/parallel/Ingress
addons_test.go:211: (dbg) Run:  kubectl --context addons-060437 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:236: (dbg) Run:  kubectl --context addons-060437 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:249: (dbg) Run:  kubectl --context addons-060437 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:353: "nginx" [350176ac-abb6-4d4c-9193-079547c77f3d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx" [350176ac-abb6-4d4c-9193-079547c77f3d] Running
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 7.003880705s
I1217 20:12:12.588633  369461 kapi.go:150] Service nginx in namespace default found.
addons_test.go:266: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:290: (dbg) Run:  kubectl --context addons-060437 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:295: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 ip
addons_test.go:301: (dbg) Run:  nslookup hello-john.test 192.168.49.2
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable ingress-dns --alsologtostderr -v=1: (1.945040408s)
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable ingress --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable ingress --alsologtostderr -v=1: (8.147102901s)
--- PASS: TestAddons/parallel/Ingress (19.16s)

TestAddons/parallel/InspektorGadget (11.78s)
=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget
=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:353: "gadget-kk4t6" [9fa05120-4929-417f-86b9-7b1e3d839dd2] Running
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 6.003029054s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable inspektor-gadget --alsologtostderr -v=1: (5.774044278s)
--- PASS: TestAddons/parallel/InspektorGadget (11.78s)

TestAddons/parallel/MetricsServer (5.82s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:457: metrics-server stabilized in 4.672966ms
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:353: "metrics-server-85b7d694d7-rhhdj" [f6a3cfab-65a0-4fc8-87f4-e89f46fd25ca] Running
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.003506372s
addons_test.go:465: (dbg) Run:  kubectl --context addons-060437 top pods -n kube-system
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.82s)

TestAddons/parallel/CSI (53.35s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
I1217 20:11:28.862613  369461 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1217 20:11:28.866522  369461 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1217 20:11:28.866551  369461 kapi.go:107] duration metric: took 7.872222ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:551: csi-hostpath-driver pods stabilized in 7.883332ms
addons_test.go:554: (dbg) Run:  kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:559: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:564: (dbg) Run:  kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:569: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:353: "task-pv-pod" [21abe71a-bef6-449c-9ea5-0c450ad10af1] Pending
helpers_test.go:353: "task-pv-pod" [21abe71a-bef6-449c-9ea5-0c450ad10af1] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:353: "task-pv-pod" [21abe71a-bef6-449c-9ea5-0c450ad10af1] Running
addons_test.go:569: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 9.005563837s
addons_test.go:574: (dbg) Run:  kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:579: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:428: (dbg) Run:  kubectl --context addons-060437 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:428: (dbg) Run:  kubectl --context addons-060437 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:584: (dbg) Run:  kubectl --context addons-060437 delete pod task-pv-pod
addons_test.go:584: (dbg) Done: kubectl --context addons-060437 delete pod task-pv-pod: (1.217740131s)
addons_test.go:590: (dbg) Run:  kubectl --context addons-060437 delete pvc hpvc
addons_test.go:596: (dbg) Run:  kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:601: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:606: (dbg) Run:  kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:611: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:353: "task-pv-pod-restore" [cae11425-2599-4632-b069-314a9e11ef8e] Pending
helpers_test.go:353: "task-pv-pod-restore" [cae11425-2599-4632-b069-314a9e11ef8e] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:353: "task-pv-pod-restore" [cae11425-2599-4632-b069-314a9e11ef8e] Running
addons_test.go:611: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 8.004099978s
addons_test.go:616: (dbg) Run:  kubectl --context addons-060437 delete pod task-pv-pod-restore
addons_test.go:620: (dbg) Run:  kubectl --context addons-060437 delete pvc hpvc-restore
addons_test.go:624: (dbg) Run:  kubectl --context addons-060437 delete volumesnapshot new-snapshot-demo
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable volumesnapshots --alsologtostderr -v=1: (1.328592708s)
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable csi-hostpath-driver --alsologtostderr -v=1: (7.087073373s)
--- PASS: TestAddons/parallel/CSI (53.35s)
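
For reference, the snapshot-and-restore cycle exercised above is (a minimal sketch using this run's profile and testdata paths):

	kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pvc.yaml         # claim "hpvc"
	kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pv-pod.yaml      # pod that writes into the volume
	kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/snapshot.yaml    # VolumeSnapshot "new-snapshot-demo"
	kubectl --context addons-060437 delete pod task-pv-pod
	kubectl --context addons-060437 delete pvc hpvc
	kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pvc-restore.yaml     # new PVC sourced from the snapshot
	kubectl --context addons-060437 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml  # pod that reads the restored data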

TestAddons/parallel/Headlamp (16.82s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:810: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-060437 --alsologtostderr -v=1
addons_test.go:815: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:353: "headlamp-dfcdc64b-sz8jq" [4ef5945c-818b-4f4a-b90a-e483f0f89acd] Pending
helpers_test.go:353: "headlamp-dfcdc64b-sz8jq" [4ef5945c-818b-4f4a-b90a-e483f0f89acd] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:353: "headlamp-dfcdc64b-sz8jq" [4ef5945c-818b-4f4a-b90a-e483f0f89acd] Running
addons_test.go:815: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.003727638s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable headlamp --alsologtostderr -v=1: (5.877874643s)
--- PASS: TestAddons/parallel/Headlamp (16.82s)

TestAddons/parallel/CloudSpanner (6.78s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:353: "cloud-spanner-emulator-5bdddb765-6bv8d" [8ded5f3b-596c-4ce5-a26c-4af25df6094d] Running
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 6.008745004s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable cloud-spanner --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CloudSpanner (6.78s)

TestAddons/parallel/LocalPath (51.38s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:951: (dbg) Run:  kubectl --context addons-060437 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:957: (dbg) Run:  kubectl --context addons-060437 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:961: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:353: "test-local-path" [e059ac59-7b37-482f-9ec3-e2f0d6f615ad] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "test-local-path" [e059ac59-7b37-482f-9ec3-e2f0d6f615ad] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "test-local-path" [e059ac59-7b37-482f-9ec3-e2f0d6f615ad] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.003805671s
addons_test.go:969: (dbg) Run:  kubectl --context addons-060437 get pvc test-pvc -o=json
addons_test.go:978: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 ssh "cat /opt/local-path-provisioner/pvc-83f081d6-ae60-4460-831a-8459b8400fce_default_test-pvc/file1"
addons_test.go:990: (dbg) Run:  kubectl --context addons-060437 delete pod test-local-path
addons_test.go:994: (dbg) Run:  kubectl --context addons-060437 delete pvc test-pvc
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (43.188916998s)
--- PASS: TestAddons/parallel/LocalPath (51.38s)
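
The local-path check above amounts to (a sketch; the provisioned directory name embeds the PVC UID, so the wildcard below is illustrative rather than the literal path this run used):

	kubectl --context addons-060437 apply -f testdata/storage-provisioner-rancher/pvc.yaml
	kubectl --context addons-060437 apply -f testdata/storage-provisioner-rancher/pod.yaml
	# the provisioner backs the PVC with a hostPath directory on the node
	minikube -p addons-060437 ssh "sudo cat /opt/local-path-provisioner/pvc-*_default_test-pvc/file1"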

TestAddons/parallel/NvidiaDevicePlugin (6.6s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:353: "nvidia-device-plugin-daemonset-dchv5" [794ac0d5-a3a7-4bd3-8232-5a1cf8a43fe9] Running
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 6.004406458s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable nvidia-device-plugin --alsologtostderr -v=1
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (6.60s)

TestAddons/parallel/Yakd (10.83s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:353: "yakd-dashboard-6654c87f9b-vsqp7" [937aba38-43fc-407e-8bdd-733f474a6c5d] Running
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.003414512s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-060437 addons disable yakd --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-060437 addons disable yakd --alsologtostderr -v=1: (5.822312279s)
--- PASS: TestAddons/parallel/Yakd (10.83s)

TestAddons/StoppedEnableDisable (12.34s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-060437
addons_test.go:174: (dbg) Done: out/minikube-linux-arm64 stop -p addons-060437: (12.05377665s)
addons_test.go:178: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-060437
addons_test.go:182: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-060437
addons_test.go:187: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-060437
--- PASS: TestAddons/StoppedEnableDisable (12.34s)

TestCertOptions (37.81s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-828380 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-828380 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd: (34.848189503s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-828380 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-828380 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-828380 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:176: Cleaning up "cert-options-828380" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-828380
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-828380: (2.153645145s)
--- PASS: TestCertOptions (37.81s)
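
To reproduce the certificate check by hand (a sketch; flags copied from the invocation above):

	minikube start -p cert-options-828380 --memory=3072 \
	    --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 \
	    --apiserver-names=localhost --apiserver-names=www.google.com \
	    --apiserver-port=8555 --driver=docker --container-runtime=containerd
	# the extra IPs and names should show up as SANs on the apiserver certificate
	minikube -p cert-options-828380 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"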

TestCertExpiration (234.46s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-278380 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-278380 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd: (44.20403653s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-278380 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-278380 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd: (7.626898342s)
helpers_test.go:176: Cleaning up "cert-expiration-278380" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-278380
E1217 21:19:36.456423  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-278380: (2.627973129s)
--- PASS: TestCertExpiration (234.46s)
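
The scenario above is driven entirely by --cert-expiration (a sketch): start with a short TTL, let it lapse (the test waits out the 3m), then start again with a long TTL so minikube regenerates the certificates:

	minikube start -p cert-expiration-278380 --memory=3072 --cert-expiration=3m --driver=docker --container-runtime=containerd
	# ...wait for the 3m certificates to expire, then:
	minikube start -p cert-expiration-278380 --memory=3072 --cert-expiration=8760h --driver=docker --container-runtime=containerd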

TestForceSystemdFlag (48.17s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-844765 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-844765 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (45.216990096s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-844765 ssh "cat /etc/containerd/config.toml"
helpers_test.go:176: Cleaning up "force-systemd-flag-844765" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-844765
E1217 21:16:28.507510  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-844765: (2.543269618s)
--- PASS: TestForceSystemdFlag (48.17s)

TestForceSystemdEnv (40.67s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-372844 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-372844 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (37.915296244s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-env-372844 ssh "cat /etc/containerd/config.toml"
helpers_test.go:176: Cleaning up "force-systemd-env-372844" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-372844
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-372844: (2.339780493s)
--- PASS: TestForceSystemdEnv (40.67s)

TestDockerEnvContainerd (47.41s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd true linux arm64
docker_test.go:181: (dbg) Run:  out/minikube-linux-arm64 start -p dockerenv-568942 --driver=docker  --container-runtime=containerd
docker_test.go:181: (dbg) Done: out/minikube-linux-arm64 start -p dockerenv-568942 --driver=docker  --container-runtime=containerd: (31.843254155s)
docker_test.go:189: (dbg) Run:  /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-568942"
docker_test.go:189: (dbg) Done: /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-568942": (1.083588065s)
docker_test.go:220: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-SC1emEQkkE3K/agent.389639" SSH_AGENT_PID="389640" DOCKER_HOST=ssh://docker@127.0.0.1:33148 docker version"
docker_test.go:243: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-SC1emEQkkE3K/agent.389639" SSH_AGENT_PID="389640" DOCKER_HOST=ssh://docker@127.0.0.1:33148 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env"
docker_test.go:243: (dbg) Done: /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-SC1emEQkkE3K/agent.389639" SSH_AGENT_PID="389640" DOCKER_HOST=ssh://docker@127.0.0.1:33148 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env": (1.063357072s)
docker_test.go:250: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-SC1emEQkkE3K/agent.389639" SSH_AGENT_PID="389640" DOCKER_HOST=ssh://docker@127.0.0.1:33148 docker image ls"
helpers_test.go:176: Cleaning up "dockerenv-568942" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p dockerenv-568942
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p dockerenv-568942: (2.488321151s)
--- PASS: TestDockerEnvContainerd (47.41s)
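
The docker-env round trip above can be reproduced as follows (a sketch; eval-ing the printed exports is the usual pattern, and the SSH socket/port values seen in the log are per-run):

	minikube start -p dockerenv-568942 --driver=docker --container-runtime=containerd
	eval "$(minikube -p dockerenv-568942 docker-env --ssh-host --ssh-add)"
	docker version        # now talks to the daemon inside the minikube node over SSH
	DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env
	docker image ls       # the freshly built image should be listed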

TestErrorSpam/setup (31.94s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-238527 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-238527 --driver=docker  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-238527 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-238527 --driver=docker  --container-runtime=containerd: (31.939089384s)
--- PASS: TestErrorSpam/setup (31.94s)

TestErrorSpam/start (0.81s)

=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 start --dry-run
--- PASS: TestErrorSpam/start (0.81s)

TestErrorSpam/status (1.05s)

=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 status
--- PASS: TestErrorSpam/status (1.05s)

TestErrorSpam/pause (1.75s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 pause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 pause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 pause
--- PASS: TestErrorSpam/pause (1.75s)

TestErrorSpam/unpause (1.83s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 unpause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 unpause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 unpause
--- PASS: TestErrorSpam/unpause (1.83s)

TestErrorSpam/stop (1.66s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 stop: (1.459445188s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-238527 --log_dir /tmp/nospam-238527 stop
--- PASS: TestErrorSpam/stop (1.66s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (48.26s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd
E1217 20:14:49.023030  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.029698  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.041008  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.062328  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.103654  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.184995  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.346433  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:49.668025  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:50.309484  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:51.590737  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:54.152395  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:14:59.273769  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:15:09.515666  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-032730 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd: (48.26045286s)
--- PASS: TestFunctional/serial/StartWithProxy (48.26s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (7.22s)

=== RUN   TestFunctional/serial/SoftStart
I1217 20:15:12.291266  369461 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --alsologtostderr -v=8
functional_test.go:674: (dbg) Done: out/minikube-linux-arm64 start -p functional-032730 --alsologtostderr -v=8: (7.220094083s)
functional_test.go:678: soft start took 7.221426567s for "functional-032730" cluster.
I1217 20:15:19.511760  369461 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestFunctional/serial/SoftStart (7.22s)

TestFunctional/serial/KubeContext (0.07s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.07s)

TestFunctional/serial/KubectlGetPods (0.1s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-032730 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.10s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.58s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:3.1: (1.344236277s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:3.3: (1.197016174s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 cache add registry.k8s.io/pause:latest: (1.033803262s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.58s)

TestFunctional/serial/CacheCmd/cache/add_local (1.33s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-032730 /tmp/TestFunctionalserialCacheCmdcacheadd_local1961350511/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache add minikube-local-cache-test:functional-032730
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache delete minikube-local-cache-test:functional-032730
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-032730
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.33s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

TestFunctional/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.06s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.35s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.35s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.88s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (289.163415ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.88s)
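
The reload check above follows this cycle (a sketch; crictl runs inside the node, hence the minikube ssh wrapper):

	minikube -p functional-032730 cache add registry.k8s.io/pause:latest
	minikube -p functional-032730 ssh sudo crictl rmi registry.k8s.io/pause:latest        # remove it from the runtime
	minikube -p functional-032730 ssh sudo crictl inspecti registry.k8s.io/pause:latest   # fails: image is gone
	minikube -p functional-032730 cache reload                                            # pushes cached images back into the node
	minikube -p functional-032730 ssh sudo crictl inspecti registry.k8s.io/pause:latest   # succeeds again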

TestFunctional/serial/CacheCmd/cache/delete (0.12s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.12s)

TestFunctional/serial/MinikubeKubectlCmd (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 kubectl -- --context functional-032730 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.14s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-032730 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.14s)

TestFunctional/serial/ExtraConfig (51.4s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1217 20:15:29.997490  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:16:10.958879  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-032730 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (51.395546585s)
functional_test.go:776: restart took 51.395675644s for "functional-032730" cluster.
I1217 20:16:18.740541  369461 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestFunctional/serial/ExtraConfig (51.40s)
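
The restart above shows how --extra-config threads a component flag through an existing profile (a sketch; the key format is component.flag-name):

	minikube start -p functional-032730 \
	    --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all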

TestFunctional/serial/ComponentHealth (0.09s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-032730 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.09s)

TestFunctional/serial/LogsCmd (1.47s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 logs: (1.466663414s)
--- PASS: TestFunctional/serial/LogsCmd (1.47s)

TestFunctional/serial/LogsFileCmd (1.5s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 logs --file /tmp/TestFunctionalserialLogsFileCmd1499875325/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 logs --file /tmp/TestFunctionalserialLogsFileCmd1499875325/001/logs.txt: (1.495649714s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.50s)

TestFunctional/serial/InvalidService (4.92s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-032730 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-032730
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-032730: exit status 115 (468.180086ms)

-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:30547 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-032730 delete -f testdata/invalidsvc.yaml
functional_test.go:2332: (dbg) Done: kubectl --context functional-032730 delete -f testdata/invalidsvc.yaml: (1.202503411s)
--- PASS: TestFunctional/serial/InvalidService (4.92s)
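
The behaviour being asserted above: running minikube service against a Service whose pods never come up should still print the URL table but exit with SVC_UNREACHABLE (status 115). A sketch of the repro:

	kubectl --context functional-032730 apply -f testdata/invalidsvc.yaml
	minikube service invalid-svc -p functional-032730    # expected: exit status 115
	kubectl --context functional-032730 delete -f testdata/invalidsvc.yaml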

TestFunctional/parallel/ConfigCmd (0.47s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 config get cpus: exit status 14 (82.723165ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 config get cpus: exit status 14 (70.154046ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.47s)
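
The config subcommands exercised above (a sketch; "config get" on an unset key exits with status 14, which is exactly what the test asserts):

	minikube -p functional-032730 config set cpus 2
	minikube -p functional-032730 config get cpus      # prints 2
	minikube -p functional-032730 config unset cpus
	minikube -p functional-032730 config get cpus      # exit status 14: key not found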

TestFunctional/parallel/DashboardCmd (10.35s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-032730 --alsologtostderr -v=1]
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-032730 --alsologtostderr -v=1] ...
helpers_test.go:526: unable to kill pid 404902: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (10.35s)

TestFunctional/parallel/DryRun (0.49s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (209.975174ms)

-- stdout --
	* [functional-032730] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1217 20:16:57.567830  404603 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:16:57.568008  404603 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:16:57.568040  404603 out.go:374] Setting ErrFile to fd 2...
	I1217 20:16:57.568062  404603 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:16:57.568465  404603 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:16:57.569454  404603 out.go:368] Setting JSON to false
	I1217 20:16:57.570665  404603 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10763,"bootTime":1765991855,"procs":205,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:16:57.570784  404603 start.go:143] virtualization:  
	I1217 20:16:57.574140  404603 out.go:179] * [functional-032730] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:16:57.577157  404603 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:16:57.577322  404603 notify.go:221] Checking for updates...
	I1217 20:16:57.582722  404603 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:16:57.586283  404603 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:16:57.589869  404603 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:16:57.594329  404603 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:16:57.597265  404603 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:16:57.600668  404603 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 20:16:57.601293  404603 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:16:57.633559  404603 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:16:57.633691  404603 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:16:57.705697  404603 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-17 20:16:57.687198916 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:16:57.705814  404603 docker.go:319] overlay module found
	I1217 20:16:57.708881  404603 out.go:179] * Using the docker driver based on existing profile
	I1217 20:16:57.711753  404603 start.go:309] selected driver: docker
	I1217 20:16:57.711788  404603 start.go:927] validating driver "docker" against &{Name:functional-032730 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-032730 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:16:57.711880  404603 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:16:57.715510  404603 out.go:203] 
	W1217 20:16:57.718405  404603 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1217 20:16:57.721303  404603 out.go:203] 
** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.49s)
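DryRun passing on a non-zero exit is intentional: --dry-run runs minikube's validation without touching the existing cluster, and any --memory below the usable minimum fails fast with RSRC_INSUFFICIENT_REQ_MEMORY. A minimal sketch to reproduce by hand, using the binary and profile names from this run:

	# expect exit status 23 and an RSRC_INSUFFICIENT_REQ_MEMORY message
	out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --driver=docker --container-runtime=containerd
	echo $?   # 23; raising --memory to 1800MB or more clears the check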
x
+
TestFunctional/parallel/InternationalLanguage (0.21s)
=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (214.000361ms)
-- stdout --
	* [functional-032730] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	
-- /stdout --
** stderr ** 
	I1217 20:16:57.365819  404556 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:16:57.365966  404556 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:16:57.365978  404556 out.go:374] Setting ErrFile to fd 2...
	I1217 20:16:57.365983  404556 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:16:57.367801  404556 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:16:57.368375  404556 out.go:368] Setting JSON to false
	I1217 20:16:57.370456  404556 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":10763,"bootTime":1765991855,"procs":205,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:16:57.370542  404556 start.go:143] virtualization:  
	I1217 20:16:57.374260  404556 out.go:179] * [functional-032730] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1217 20:16:57.377172  404556 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:16:57.377325  404556 notify.go:221] Checking for updates...
	I1217 20:16:57.383370  404556 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:16:57.386442  404556 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:16:57.389253  404556 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:16:57.392205  404556 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:16:57.395073  404556 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:16:57.398590  404556 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 20:16:57.399282  404556 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:16:57.431049  404556 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:16:57.431208  404556 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:16:57.495867  404556 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-17 20:16:57.485975768 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:16:57.495980  404556 docker.go:319] overlay module found
	I1217 20:16:57.499231  404556 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1217 20:16:57.502154  404556 start.go:309] selected driver: docker
	I1217 20:16:57.502185  404556 start.go:927] validating driver "docker" against &{Name:functional-032730 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-032730 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:16:57.502303  404556 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:16:57.505873  404556 out.go:203] 
	W1217 20:16:57.508897  404556 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1217 20:16:57.511849  404556 out.go:203] 
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.21s)
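The French output is the same dry-run failure rendered through minikube's translations; the test appears to select the language via the locale environment rather than a flag. A sketch, assuming LC_ALL is what drives the lookup:

	# same RSRC_INSUFFICIENT_REQ_MEMORY failure, localized messages
	LC_ALL=fr out/minikube-linux-arm64 start -p functional-032730 --dry-run --memory 250MB --driver=docker --container-runtime=containerd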
x
+
TestFunctional/parallel/StatusCmd (1.1s)
=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.10s)
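The -f flag takes a Go template evaluated against the status object, so the label text is free-form (the "kublet" spelling above is a quirk of the test's template, not a field name); only {{.Host}}, {{.Kubelet}}, {{.APIServer}} and {{.Kubeconfig}} matter. Sketch:

	out/minikube-linux-arm64 -p functional-032730 status -f 'host:{{.Host}},kubelet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}'
	out/minikube-linux-arm64 -p functional-032730 status -o json   # same data, machine-readable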
x
+
TestFunctional/parallel/ServiceCmdConnect (7.73s)
=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-032730 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-032730 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:353: "hello-node-connect-7d85dfc575-bbkrj" [e4db4e03-5e69-4ba7-a0c3-c81dc7b29685] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-connect-7d85dfc575-bbkrj" [e4db4e03-5e69-4ba7-a0c3-c81dc7b29685] Running
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.003685666s
functional_test.go:1654: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service hello-node-connect --url
functional_test.go:1660: found endpoint for hello-node-connect: http://192.168.49.2:31064
functional_test.go:1680: http://192.168.49.2:31064: success! body:
Request served by hello-node-connect-7d85dfc575-bbkrj

HTTP/1.1 GET /

Host: 192.168.49.2:31064
Accept-Encoding: gzip
User-Agent: Go-http-client/1.1
--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.73s)
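The steps above are the usual deploy-expose-probe loop; sketched by hand with the names and image from this run:

	kubectl --context functional-032730 create deployment hello-node-connect --image kicbase/echo-server
	kubectl --context functional-032730 expose deployment hello-node-connect --type=NodePort --port=8080
	URL=$(out/minikube-linux-arm64 -p functional-032730 service hello-node-connect --url)
	curl -s "$URL"   # echo-server replies with the request it received, as in the body above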
x
+
TestFunctional/parallel/AddonsCmd (0.13s)
=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.13s)
x
+
TestFunctional/parallel/PersistentVolumeClaim (22.96s)
=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:353: "storage-provisioner" [216b53e5-ce25-4c7b-b811-2f81f7657797] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 6.00333125s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-032730 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-032730 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-032730 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-032730 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [490362eb-042b-4f6e-a217-29141b751816] Pending
helpers_test.go:353: "sp-pod" [490362eb-042b-4f6e-a217-29141b751816] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:353: "sp-pod" [490362eb-042b-4f6e-a217-29141b751816] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.003132382s
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-032730 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:112: (dbg) Run:  kubectl --context functional-032730 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-032730 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [177bc998-65cc-4c72-a079-52eff32da383] Pending
helpers_test.go:353: "sp-pod" [177bc998-65cc-4c72-a079-52eff32da383] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.00366222s
functional_test_pvc_test.go:120: (dbg) Run:  kubectl --context functional-032730 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (22.96s)
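The second sp-pod finding /tmp/mount/foo is the actual assertion: the claim outlives the pod, so a replacement pod bound to the same PVC sees data written by its predecessor. A minimal sketch of a claim in the spirit of testdata/storage-provisioner/pvc.yaml (the field values here are assumptions, not the repo's exact manifest):

	kubectl --context functional-032730 apply -f - <<'EOF'
	apiVersion: v1
	kind: PersistentVolumeClaim
	metadata:
	  name: myclaim
	spec:
	  accessModes: ["ReadWriteOnce"]
	  resources:
	    requests:
	      storage: 500Mi
	EOF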
x
+
TestFunctional/parallel/SSHCmd (0.71s)
=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.71s)
x
+
TestFunctional/parallel/CpCmd (2.47s)
=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh -n functional-032730 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cp functional-032730:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd3557010762/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh -n functional-032730 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh -n functional-032730 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.47s)
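cp copies in both directions and creates missing directories on the node; node-side paths are addressed as <profile>:<path>. Sketch:

	out/minikube-linux-arm64 -p functional-032730 cp testdata/cp-test.txt /home/docker/cp-test.txt         # host to node
	out/minikube-linux-arm64 -p functional-032730 cp functional-032730:/home/docker/cp-test.txt ./out.txt  # node to host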
x
+
TestFunctional/parallel/FileSync (0.37s)
=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/369461/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /etc/test/nested/copy/369461/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.37s)
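File sync mirrors anything under the profile's files directory into the node at the same path, which is where /etc/test/nested/copy/369461/hosts comes from. A sketch, assuming the MINIKUBE_HOME layout shown in this run:

	mkdir -p /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461
	cp somefile /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts
	# the file appears at /etc/test/nested/copy/369461/hosts inside the node after the next start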
x
+
TestFunctional/parallel/CertSync (2.21s)
=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/369461.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /etc/ssl/certs/369461.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/369461.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /usr/share/ca-certificates/369461.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3694612.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /etc/ssl/certs/3694612.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/3694612.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /usr/share/ca-certificates/3694612.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.21s)
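Cert sync is the same mechanism for CA material: PEM files under the profile's certs directory are installed into /etc/ssl/certs and /usr/share/ca-certificates on the node, and the numeric names (51391683.0, 3ec20f2e.0) are OpenSSL subject-hash links. To check a hash by hand (the cert path here is a placeholder):

	openssl x509 -hash -noout -in /path/to/369461.pem   # prints the hash, e.g. 51391683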
x
+
TestFunctional/parallel/NodeLabels (0.11s)
=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-032730 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.11s)
x
+
TestFunctional/parallel/NonActiveRuntimeDisabled (0.56s)
=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "sudo systemctl is-active docker": exit status 1 (280.272069ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "sudo systemctl is-active crio": exit status 1 (283.855832ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.56s)
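The "Process exited with status 3" lines are expected on a pass: systemctl is-active prints "inactive" and exits 3 for a stopped unit, ssh carries that status out, and minikube surfaces it as exit status 1, which is what the test asserts for docker and crio on a containerd node. Sketch:

	out/minikube-linux-arm64 -p functional-032730 ssh "sudo systemctl is-active docker"; echo "exit=$?"
	# prints: inactive, then exit=1 (the remote 3 shows up in stderr as the ssh status)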
x
+
TestFunctional/parallel/License (0.25s)
=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.25s)
x
+
TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.61s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr] ...
helpers_test.go:526: unable to kill pid 402113: os: process already finished
helpers_test.go:520: unable to terminate pid 401929: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.61s)
x
+
TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)
x
+
TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.47s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-032730 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:353: "nginx-svc" [cd04d8d0-7bff-480c-bd63-96dd61a50466] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx-svc" [cd04d8d0-7bff-480c-bd63-96dd61a50466] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 8.004290413s
I1217 20:16:36.976860  369461 kapi.go:150] Service nginx-svc in namespace default found.
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.47s)
x
+
TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.09s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-032730 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.09s)
x
+
TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.106.1.178 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.00s)
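AccessDirect passing means the tunnel is routing the cluster's service network to the host, so the nginx-svc load-balancer IP (10.106.1.178 in this run) answers without a NodePort. A two-terminal sketch:

	# terminal 1: keep running; installs routes and may prompt for sudo
	out/minikube-linux-arm64 -p functional-032730 tunnel
	# terminal 2: IP taken from the WaitService/IngressIP step above
	curl -s http://10.106.1.178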
x
+
TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-032730 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)
x
+
TestFunctional/parallel/ServiceCmd/DeployApp (8.38s)
=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-032730 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-032730 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:353: "hello-node-75c85bcc94-7n2qk" [73bbf828-339f-477d-8ab3-c3fb49dd711f] Pending
helpers_test.go:353: "hello-node-75c85bcc94-7n2qk" [73bbf828-339f-477d-8ab3-c3fb49dd711f] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-75c85bcc94-7n2qk" [73bbf828-339f-477d-8ab3-c3fb49dd711f] Running
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 8.004022747s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (8.38s)
x
+
TestFunctional/parallel/ProfileCmd/profile_not_create (0.45s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.45s)
x
+
TestFunctional/parallel/ProfileCmd/profile_list (0.42s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "357.804098ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "62.38344ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.42s)
x
+
TestFunctional/parallel/ProfileCmd/profile_json_output (0.43s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "361.629399ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "70.038964ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.43s)
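The timing gap between the two calls is the point of --light, which appears to skip probing each cluster's live status and read only the stored profile config, hence roughly 360ms versus 70ms here. Sketch:

	out/minikube-linux-arm64 profile list -o json          # includes cluster status
	out/minikube-linux-arm64 profile list -o json --light  # config only, faster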
x
+
TestFunctional/parallel/ServiceCmd/List (0.74s)
=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.74s)
x
+
TestFunctional/parallel/MountCmd/any-port (8.71s)
=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdany-port1650584952/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1766002613443310040" to /tmp/TestFunctionalparallelMountCmdany-port1650584952/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1766002613443310040" to /tmp/TestFunctionalparallelMountCmdany-port1650584952/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1766002613443310040" to /tmp/TestFunctionalparallelMountCmdany-port1650584952/001/test-1766002613443310040
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (499.604579ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1217 20:16:53.944469  369461 retry.go:31] will retry after 635.889252ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 17 20:16 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 17 20:16 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 17 20:16 test-1766002613443310040
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh cat /mount-9p/test-1766002613443310040
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-032730 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:353: "busybox-mount" [e3f98ba9-7db0-479f-9da3-e5c3a7563a6b] Pending
helpers_test.go:353: "busybox-mount" [e3f98ba9-7db0-479f-9da3-e5c3a7563a6b] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:353: "busybox-mount" [e3f98ba9-7db0-479f-9da3-e5c3a7563a6b] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "busybox-mount" [e3f98ba9-7db0-479f-9da3-e5c3a7563a6b] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.00395899s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-032730 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdany-port1650584952/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.71s)
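The initial findmnt failure is a benign race, absorbed by the retry at 20:16:53: the 9p server needs a moment before the guest-side mount appears. The underlying usage, sketched with a placeholder host directory:

	# terminal 1: serve a host directory into the node over 9p
	out/minikube-linux-arm64 mount -p functional-032730 /tmp/shared:/mount-9p
	# terminal 2: verify from inside the node
	out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p"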
x
+
TestFunctional/parallel/ServiceCmd/JSONOutput (0.52s)
=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service list -o json
functional_test.go:1504: Took "521.894041ms" to run "out/minikube-linux-arm64 -p functional-032730 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.52s)
x
+
TestFunctional/parallel/ServiceCmd/HTTPS (0.54s)
=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:31773
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.54s)
x
+
TestFunctional/parallel/ServiceCmd/Format (0.53s)
=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.53s)
x
+
TestFunctional/parallel/ServiceCmd/URL (0.47s)
=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:31773
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.47s)
x
+
TestFunctional/parallel/MountCmd/specific-port (2.01s)
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdspecific-port4182359915/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (381.846038ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1217 20:17:02.532843  369461 retry.go:31] will retry after 266.356025ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdspecific-port4182359915/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "sudo umount -f /mount-9p": exit status 1 (351.132997ms)
-- stdout --
	umount: /mount-9p: not mounted.
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-032730 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdspecific-port4182359915/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (2.01s)
x
+
TestFunctional/parallel/MountCmd/VerifyCleanup (2.77s)
=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T" /mount1: exit status 1 (1.07720307s)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1217 20:17:05.237548  369461 retry.go:31] will retry after 514.887275ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-032730 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-032730 /tmp/TestFunctionalparallelMountCmdVerifyCleanup633149461/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (2.77s)
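VerifyCleanup leans on mount --kill, which tears down every mount process belonging to the profile in one call rather than stopping the three daemons individually:

	out/minikube-linux-arm64 mount -p functional-032730 --kill=true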
x
+
TestFunctional/parallel/Version/short (0.07s)
=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 version --short
--- PASS: TestFunctional/parallel/Version/short (0.07s)
x
+
TestFunctional/parallel/Version/components (1.38s)
=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 version -o=json --components: (1.384822707s)
--- PASS: TestFunctional/parallel/Version/components (1.38s)
x
+
TestFunctional/parallel/ImageCommands/ImageListShort (0.3s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-032730 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.3
registry.k8s.io/kube-proxy:v1.34.3
registry.k8s.io/kube-controller-manager:v1.34.3
registry.k8s.io/kube-apiserver:v1.34.3
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
public.ecr.aws/nginx/nginx:alpine
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/minikube-local-cache-test:functional-032730
docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-032730
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-032730 image ls --format short --alsologtostderr:
I1217 20:17:14.418444  407678 out.go:360] Setting OutFile to fd 1 ...
I1217 20:17:14.418600  407678 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.418628  407678 out.go:374] Setting ErrFile to fd 2...
I1217 20:17:14.418644  407678 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.418934  407678 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:17:14.419605  407678 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.419768  407678 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.420396  407678 cli_runner.go:164] Run: docker container inspect functional-032730 --format={{.State.Status}}
I1217 20:17:14.444904  407678 ssh_runner.go:195] Run: systemctl --version
I1217 20:17:14.444954  407678 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-032730
I1217 20:17:14.466956  407678 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33158 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-032730/id_rsa Username:docker}
I1217 20:17:14.564593  407678 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.30s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-032730 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬───────────────────────────────────────┬───────────────┬────────┐
│                    IMAGE                    │                  TAG                  │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼───────────────────────────────────────┼───────────────┼────────┤
│ docker.io/kicbase/echo-server               │ functional-032730                     │ sha256:ce2d2c │ 2.17MB │
│ docker.io/kindest/kindnetd                  │ v20251212-v0.29.0-alpha-105-g20ccfc88 │ sha256:c96ee3 │ 38.5MB │
│ registry.k8s.io/kube-apiserver              │ v1.34.3                               │ sha256:cf65ae │ 24.6MB │
│ public.ecr.aws/nginx/nginx                  │ alpine                                │ sha256:10afed │ 23MB   │
│ registry.k8s.io/coredns/coredns             │ v1.12.1                               │ sha256:138784 │ 20.4MB │
│ registry.k8s.io/kube-controller-manager     │ v1.34.3                               │ sha256:7ada8f │ 20.7MB │
│ registry.k8s.io/kube-scheduler              │ v1.34.3                               │ sha256:2f2aa2 │ 15.8MB │
│ registry.k8s.io/pause                       │ 3.3                                   │ sha256:3d1873 │ 249kB  │
│ docker.io/library/minikube-local-cache-test │ functional-032730                     │ sha256:05258a │ 992B   │
│ gcr.io/k8s-minikube/busybox                 │ 1.28.4-glibc                          │ sha256:1611cd │ 1.94MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0                               │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-proxy                  │ v1.34.3                               │ sha256:4461da │ 22.8MB │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b                    │ sha256:b1a8c6 │ 40.6MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                                    │ sha256:ba04bb │ 8.03MB │
│ registry.k8s.io/pause                       │ 3.1                                   │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ 3.10.1                                │ sha256:d7b100 │ 268kB  │
│ registry.k8s.io/pause                       │ latest                                │ sha256:8cb209 │ 71.3kB │
└─────────────────────────────────────────────┴───────────────────────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-032730 image ls --format table --alsologtostderr:
I1217 20:17:14.733764  407756 out.go:360] Setting OutFile to fd 1 ...
I1217 20:17:14.733903  407756 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.733911  407756 out.go:374] Setting ErrFile to fd 2...
I1217 20:17:14.733917  407756 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.734190  407756 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:17:14.734931  407756 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.735097  407756 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.735709  407756 cli_runner.go:164] Run: docker container inspect functional-032730 --format={{.State.Status}}
I1217 20:17:14.756742  407756 ssh_runner.go:195] Run: systemctl --version
I1217 20:17:14.756799  407756 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-032730
I1217 20:17:14.776718  407756 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33158 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-032730/id_rsa Username:docker}
I1217 20:17:14.873396  407756 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.29s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.27s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-032730 image ls --format json --alsologtostderr:
[{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-032730"],"size":"2173567"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13","repoDigests":["docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b5
1cd57bdce1589940df856105384ac7f753a1ab43ae"],"repoTags":["docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"],"size":"38502448"},{"id":"sha256:10afed3caf3eed1b711b8fa0a9600a7b488a45653a15a598a47ac570c1204cc4","repoDigests":["public.ecr.aws/nginx/nginx@sha256:9b0f84d48f92f2147217aec522219e9eda883a2836f1e30ab1915bd794f294ff"],"repoTags":["public.ecr.aws/nginx/nginx:alpine"],"size":"22985759"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-032730"],"size":"992"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:cf65
ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896","repoDigests":["registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.3"],"size":"24567639"},{"id":"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.3"],"size":"20719958"},{"id":"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162","repoDigests":["registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.3"],"size":"22804272"},{"id":"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6","repoDigests":["registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"],"repoTags
":["registry.k8s.io/kube-scheduler:v1.34.3"],"size":"15776215"},{"id":"sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8","repoDigests":["docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93"],"repoTags":[],"size":"74084559"},{"id":"sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a","repoDigests":["docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c"],"repoTags":[],"size":"18306114"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"1935750"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2
e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"20392204"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-032730 image ls --format json --alsologtostderr:
I1217 20:17:14.686865  407750 out.go:360] Setting OutFile to fd 1 ...
I1217 20:17:14.687065  407750 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.687074  407750 out.go:374] Setting ErrFile to fd 2...
I1217 20:17:14.687080  407750 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.687368  407750 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:17:14.688088  407750 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.688304  407750 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.688878  407750 cli_runner.go:164] Run: docker container inspect functional-032730 --format={{.State.Status}}
I1217 20:17:14.709086  407750 ssh_runner.go:195] Run: systemctl --version
I1217 20:17:14.709146  407750 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-032730
I1217 20:17:14.743687  407750 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33158 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-032730/id_rsa Username:docker}
I1217 20:17:14.843422  407750 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.27s)
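The stdout above is one JSON array of image records, each carrying id, repoDigests, repoTags, and size (bytes, encoded as a string). A minimal sketch of consuming that output from Go, assuming minikube is on PATH; the profile name is simply the one from this run and is a placeholder:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// image mirrors the fields visible in the output above; the shape is
// inferred from this report, not from a published schema.
type image struct {
	ID          string   `json:"id"`
	RepoDigests []string `json:"repoDigests"`
	RepoTags    []string `json:"repoTags"`
	Size        string   `json:"size"` // bytes, as a string
}

func main() {
	// "functional-032730" is the profile used in this run; substitute your own.
	out, err := exec.Command("minikube", "-p", "functional-032730",
		"image", "ls", "--format", "json").Output()
	if err != nil {
		panic(err)
	}
	var images []image
	if err := json.Unmarshal(out, &images); err != nil {
		panic(err)
	}
	for _, img := range images {
		fmt.Println(img.ID, img.RepoTags)
	}
}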

TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-032730 image ls --format yaml --alsologtostderr:
- id: sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "1935750"
- id: sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.3
size: "24567639"
- id: sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162
repoDigests:
- registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6
repoTags:
- registry.k8s.io/kube-proxy:v1.34.3
size: "22804272"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-032730
size: "992"
- id: sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8
repoDigests:
- docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93
repoTags: []
size: "74084559"
- id: sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "20392204"
- id: sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.3
size: "20719958"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a
repoDigests:
- docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c
repoTags: []
size: "18306114"
- id: sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.3
size: "15776215"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-032730
size: "2173567"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:10afed3caf3eed1b711b8fa0a9600a7b488a45653a15a598a47ac570c1204cc4
repoDigests:
- public.ecr.aws/nginx/nginx@sha256:9b0f84d48f92f2147217aec522219e9eda883a2836f1e30ab1915bd794f294ff
repoTags:
- public.ecr.aws/nginx/nginx:alpine
size: "22985759"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13
repoDigests:
- docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae
repoTags:
- docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
size: "38502448"
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-032730 image ls --format yaml --alsologtostderr:
I1217 20:17:14.411379  407679 out.go:360] Setting OutFile to fd 1 ...
I1217 20:17:14.411615  407679 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.411639  407679 out.go:374] Setting ErrFile to fd 2...
I1217 20:17:14.411660  407679 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:14.411911  407679 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:17:14.412567  407679 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.412719  407679 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:14.413364  407679 cli_runner.go:164] Run: docker container inspect functional-032730 --format={{.State.Status}}
I1217 20:17:14.436603  407679 ssh_runner.go:195] Run: systemctl --version
I1217 20:17:14.436669  407679 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-032730
I1217 20:17:14.460024  407679 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33158 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-032730/id_rsa Username:docker}
I1217 20:17:14.560800  407679 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

TestFunctional/parallel/ImageCommands/ImageBuild (4.01s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-032730 ssh pgrep buildkitd: exit status 1 (323.884195ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr: (3.444006831s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-032730 image build -t localhost/my-image:functional-032730 testdata/build --alsologtostderr:
I1217 20:17:15.256584  407885 out.go:360] Setting OutFile to fd 1 ...
I1217 20:17:15.257343  407885 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:15.257360  407885 out.go:374] Setting ErrFile to fd 2...
I1217 20:17:15.257366  407885 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:17:15.257716  407885 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:17:15.258459  407885 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:15.260503  407885 config.go:182] Loaded profile config "functional-032730": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1217 20:17:15.261160  407885 cli_runner.go:164] Run: docker container inspect functional-032730 --format={{.State.Status}}
I1217 20:17:15.279648  407885 ssh_runner.go:195] Run: systemctl --version
I1217 20:17:15.279704  407885 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-032730
I1217 20:17:15.297519  407885 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33158 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-032730/id_rsa Username:docker}
I1217 20:17:15.394941  407885 build_images.go:162] Building image from path: /tmp/build.3343213079.tar
I1217 20:17:15.395011  407885 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1217 20:17:15.403062  407885 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.3343213079.tar
I1217 20:17:15.406702  407885 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.3343213079.tar: stat -c "%s %y" /var/lib/minikube/build/build.3343213079.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.3343213079.tar': No such file or directory
I1217 20:17:15.406737  407885 ssh_runner.go:362] scp /tmp/build.3343213079.tar --> /var/lib/minikube/build/build.3343213079.tar (3072 bytes)
I1217 20:17:15.430805  407885 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.3343213079
I1217 20:17:15.438794  407885 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.3343213079 -xf /var/lib/minikube/build/build.3343213079.tar
I1217 20:17:15.447343  407885 containerd.go:394] Building image: /var/lib/minikube/build/build.3343213079
I1217 20:17:15.447415  407885 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3343213079 --local dockerfile=/var/lib/minikube/build/build.3343213079 --output type=image,name=localhost/my-image:functional-032730
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.6s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:597e02d774b2e55ada50b725ec5660fa9b28eb62c0852cbeddc736734a257156 0.0s done
#8 exporting config sha256:68c668fb5f1a12585f1f63c0530c5ef0e16736c7312b34f905f060919113538d 0.0s done
#8 naming to localhost/my-image:functional-032730 done
#8 DONE 0.2s
I1217 20:17:18.621471  407885 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3343213079 --local dockerfile=/var/lib/minikube/build/build.3343213079 --output type=image,name=localhost/my-image:functional-032730: (3.174025743s)
I1217 20:17:18.621548  407885 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.3343213079
I1217 20:17:18.630353  407885 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.3343213079.tar
I1217 20:17:18.638121  407885 build_images.go:218] Built localhost/my-image:functional-032730 from /tmp/build.3343213079.tar
I1217 20:17:18.638156  407885 build_images.go:134] succeeded building to: functional-032730
I1217 20:17:18.638161  407885 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (4.01s)
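The log shows what `image build` does under the hood: the client packs the local context into a tar (/tmp/build.*.tar), copies it to /var/lib/minikube/build inside the node, unpacks it, and runs buildctl with the dockerfile.v0 frontend. A minimal sketch of driving the same build through the CLI from Go and verifying the result; minikube on PATH is assumed, and the profile, tag, and context path are this run's values, used as placeholders:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	// Build the context directory into an image inside the node.
	build := exec.Command("minikube", "-p", "functional-032730",
		"image", "build", "-t", "localhost/my-image:functional-032730",
		"testdata/build")
	if out, err := build.CombinedOutput(); err != nil {
		panic(fmt.Sprintf("build failed: %v\n%s", err, out))
	}
	// Confirm the freshly built tag shows up in the runtime's image list.
	ls, err := exec.Command("minikube", "-p", "functional-032730",
		"image", "ls").Output()
	if err != nil {
		panic(err)
	}
	if !strings.Contains(string(ls), "localhost/my-image:functional-032730") {
		panic("built image not found in image ls output")
	}
	fmt.Println("image built and visible")
}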

TestFunctional/parallel/ImageCommands/Setup (0.61s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
2025/12/17 20:17:08 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-032730
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.61s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.33s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr: (1.056539392s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.33s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.28s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.28s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.2s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.20s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.44s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-032730
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image load --daemon kicbase/echo-server:functional-032730 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.44s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.4s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image save kicbase/echo-server:functional-032730 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.40s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.54s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image rm kicbase/echo-server:functional-032730 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.54s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.63s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.63s)
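Taken together, the save/remove/load tests above exercise a tarball round-trip: save an image from the node to a file, remove it, then load it back. A minimal sketch of the same sequence from Go, assuming minikube is on PATH; the profile and tag are this run's values and should be substituted:

package main

import (
	"os"
	"os/exec"
)

func main() {
	const profile = "functional-032730"
	const tag = "kicbase/echo-server:" + profile
	tarball := os.TempDir() + "/echo-server-save.tar"

	for _, args := range [][]string{
		{"-p", profile, "image", "save", tag, tarball}, // node -> tarball
		{"-p", profile, "image", "rm", tag},            // drop it from the node
		{"-p", profile, "image", "load", tarball},      // tarball -> node
	} {
		cmd := exec.Command("minikube", args...)
		cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
		if err := cmd.Run(); err != nil {
			panic(err)
		}
	}
}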

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.38s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-032730
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-032730 image save --daemon kicbase/echo-server:functional-032730 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-032730
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.38s)

TestFunctional/delete_echo-server_images (0.05s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-032730
--- PASS: TestFunctional/delete_echo-server_images (0.05s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-032730
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-032730
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21808-367595/.minikube/files/etc/test/nested/copy/369461/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.31s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:3.1: (1.165699725s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:3.3: (1.09883235s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 cache add registry.k8s.io/pause:latest: (1.049379774s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.31s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialCacheC3019665427/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache add minikube-local-cache-test:functional-682596
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache delete minikube-local-cache-test:functional-682596
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (2.11s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (294.984736ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cache reload
functional_test.go:1173: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 cache reload: (1.184658578s)
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (2.11s)
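The reload check above relies on exit codes: `ssh sudo crictl inspecti` fails once the image has been removed from the node, and succeeds again after `cache reload` pushes the cached image back in. A minimal probe in Go along the same lines, with the profile and image taken from this run as placeholders; minikube on PATH is assumed:

package main

import (
	"fmt"
	"os/exec"
)

// imageCached reports whether the node's runtime still has the image,
// using the same `ssh sudo crictl inspecti` probe as the test above.
func imageCached(profile, image string) bool {
	err := exec.Command("minikube", "-p", profile, "ssh",
		"sudo", "crictl", "inspecti", image).Run()
	return err == nil // non-zero exit means "no such image" here
}

func main() {
	const profile = "functional-682596"
	const image = "registry.k8s.io/pause:latest"
	if !imageCached(profile, image) {
		// Re-push everything in minikube's local cache into the node.
		if err := exec.Command("minikube", "-p", profile, "cache", "reload").Run(); err != nil {
			panic(err)
		}
	}
	fmt.Println("cached:", imageCached(profile, image))
}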

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete (0.12s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete (0.12s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd (0.94s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd (0.94s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd (0.94s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialLogsFi3522299709/001/logs.txt
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd (0.94s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd (0.5s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 config get cpus: exit status 14 (90.87827ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 config get cpus: exit status 14 (69.56662ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd (0.50s)
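Note the exit status above: `config get` on an unset key exits 14 with "Error: specified key could not be found in config" rather than exiting 0 with empty output. A hedged Go sketch that treats 14 as "key not set" instead of a hard failure; the meaning of code 14 is inferred from this run's output, not a documented contract:

package main

import (
	"errors"
	"fmt"
	"os/exec"
	"strings"
)

// getConfig wraps `minikube config get`; exit status 14 (as seen in the
// log above) is interpreted as "the key is simply unset".
func getConfig(key string) (value string, set bool, err error) {
	out, err := exec.Command("minikube", "config", "get", key).Output()
	var ee *exec.ExitError
	if errors.As(err, &ee) && ee.ExitCode() == 14 {
		return "", false, nil // key not set
	}
	if err != nil {
		return "", false, err
	}
	return strings.TrimSpace(string(out)), true, nil
}

func main() {
	v, ok, err := getConfig("cpus")
	if err != nil {
		panic(err)
	}
	fmt.Println("cpus set:", ok, "value:", v)
}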

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun (0.46s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 23 (202.385172ms)

-- stdout --
	* [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1217 20:46:58.617760  439064 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:46:58.618058  439064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:58.618072  439064 out.go:374] Setting ErrFile to fd 2...
	I1217 20:46:58.618077  439064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:58.618368  439064 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:46:58.618782  439064 out.go:368] Setting JSON to false
	I1217 20:46:58.619603  439064 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":12564,"bootTime":1765991855,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:46:58.619674  439064 start.go:143] virtualization:  
	I1217 20:46:58.623653  439064 out.go:179] * [functional-682596] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 20:46:58.627529  439064 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:46:58.627598  439064 notify.go:221] Checking for updates...
	I1217 20:46:58.633460  439064 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:46:58.636527  439064 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:46:58.639514  439064 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:46:58.642399  439064 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:46:58.645433  439064 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:46:58.648961  439064 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:46:58.649566  439064 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:46:58.678036  439064 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:46:58.678161  439064 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:46:58.750242  439064 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:58.737836814 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:46:58.750342  439064 docker.go:319] overlay module found
	I1217 20:46:58.753452  439064 out.go:179] * Using the docker driver based on existing profile
	I1217 20:46:58.756506  439064 start.go:309] selected driver: docker
	I1217 20:46:58.756534  439064 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:46:58.756660  439064 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:46:58.760081  439064 out.go:203] 
	W1217 20:46:58.763016  439064 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1217 20:46:58.765934  439064 out.go:203] 

                                                
                                                
** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun (0.46s)
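The PASS above means the dry-run correctly rejected the undersized request: validation fails with RSRC_INSUFFICIENT_REQ_MEMORY before any resources are created. A sketch of reproducing it by hand, reusing the flags from the logged invocations (the French rerun below records exit status 23 for the same validation):

	out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --driver=docker --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
	echo $?   # 23, the exit code this log associates with RSRC_INSUFFICIENT_REQ_MEMORY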

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage (0.2s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 23 (195.279144ms)

                                                
                                                
-- stdout --
	* [functional-682596] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1217 20:46:59.081910  439189 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:46:59.082355  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082395  439189 out.go:374] Setting ErrFile to fd 2...
	I1217 20:46:59.082418  439189 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:46:59.082870  439189 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:46:59.083356  439189 out.go:368] Setting JSON to false
	I1217 20:46:59.084244  439189 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":12564,"bootTime":1765991855,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 20:46:59.084377  439189 start.go:143] virtualization:  
	I1217 20:46:59.087733  439189 out.go:179] * [functional-682596] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1217 20:46:59.091444  439189 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 20:46:59.091535  439189 notify.go:221] Checking for updates...
	I1217 20:46:59.097262  439189 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 20:46:59.100117  439189 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 20:46:59.102903  439189 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 20:46:59.105734  439189 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 20:46:59.108516  439189 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 20:46:59.111896  439189 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1217 20:46:59.112611  439189 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 20:46:59.134996  439189 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 20:46:59.135121  439189 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:46:59.205094  439189 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-17 20:46:59.195937607 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:46:59.205199  439189 docker.go:319] overlay module found
	I1217 20:46:59.208339  439189 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1217 20:46:59.211199  439189 start.go:309] selected driver: docker
	I1217 20:46:59.211235  439189 start.go:927] validating driver "docker" against &{Name:functional-682596 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765661130-22141@sha256:71e28c3ba83563df15de2abc511e112c2c57545086c1b12459c4142b1e28ee78 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-682596 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1217 20:46:59.211332  439189 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 20:46:59.214852  439189 out.go:203] 
	W1217 20:46:59.217732  439189 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1217 20:46:59.220709  439189 out.go:203] 

                                                
                                                
** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage (0.20s)
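The French lines above are the localized counterparts of the DryRun output ("Using the docker driver based on existing profile"; "Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB"); the test only verifies that localization kicks in. A sketch of forcing the same output manually, assuming minikube picks the language from the standard locale environment variables:

	LC_ALL=fr_FR.UTF-8 out/minikube-linux-arm64 start -p functional-682596 --dry-run --memory 250MB --driver=docker --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1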

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd (0.14s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd (0.14s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd (0.74s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd (0.74s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd (2.04s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh -n functional-682596 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cp functional-682596:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm1480265787/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh -n functional-682596 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh -n functional-682596 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd (2.04s)
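The three cp invocations above cover host-to-node, node-to-host, and host-to-node into a directory that does not exist yet. Condensed into a by-hand sketch (the /tmp target path is illustrative):

	out/minikube-linux-arm64 -p functional-682596 cp testdata/cp-test.txt /home/docker/cp-test.txt                # host -> node
	out/minikube-linux-arm64 -p functional-682596 cp functional-682596:/home/docker/cp-test.txt /tmp/cp-test.txt  # node -> host
	out/minikube-linux-arm64 -p functional-682596 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt         # target directory created on demand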

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync (0.34s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/369461/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /etc/test/nested/copy/369461/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync (0.34s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync (2.09s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/369461.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /etc/ssl/certs/369461.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/369461.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /usr/share/ca-certificates/369461.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3694612.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /etc/ssl/certs/3694612.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/3694612.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /usr/share/ca-certificates/3694612.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync (2.09s)
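The .0 names checked above (51391683.0, 3ec20f2e.0) follow OpenSSL's subject-hash convention for the /etc/ssl/certs symlink farm. A sketch of confirming the pairing by hand, assuming (as the check order suggests) that 51391683 is the hash of the 369461.pem certificate:

	out/minikube-linux-arm64 -p functional-682596 ssh "openssl x509 -noout -subject_hash -in /usr/share/ca-certificates/369461.pem"
	# expected output: 51391683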

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled (0.71s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "sudo systemctl is-active docker": exit status 1 (383.034514ms)

                                                
                                                
-- stdout --
	inactive

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "sudo systemctl is-active crio": exit status 1 (326.727632ms)

                                                
                                                
-- stdout --
	inactive

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled (0.71s)
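Both non-zero exits above are the expected outcome: systemctl is-active exits 0 only when a unit is active, and the inner "Process exited with status 3" is its inactive status code, so "inactive" on stdout plus a failing exit is exactly what a containerd-runtime cluster should report for docker and crio. Checking by hand:

	out/minikube-linux-arm64 -p functional-682596 ssh "sudo systemctl is-active docker"
	# prints "inactive"; the ssh command exits non-zero, mirroring systemctl's status 3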

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.33s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.33s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.06s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.06s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.5s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.50s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.22s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-682596 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-rc.1
registry.k8s.io/kube-proxy:v1.35.0-rc.1
registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
registry.k8s.io/kube-apiserver:v1.35.0-rc.1
registry.k8s.io/etcd:3.6.6-0
registry.k8s.io/coredns/coredns:v1.13.1
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/library/minikube-local-cache-test:functional-682596
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-682596
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-682596 image ls --format short --alsologtostderr:
I1217 20:47:02.216815  439832 out.go:360] Setting OutFile to fd 1 ...
I1217 20:47:02.217025  439832 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.217055  439832 out.go:374] Setting ErrFile to fd 2...
I1217 20:47:02.217075  439832 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.217356  439832 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:47:02.218045  439832 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.218215  439832 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.218774  439832 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:47:02.236184  439832 ssh_runner.go:195] Run: systemctl --version
I1217 20:47:02.236233  439832 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:47:02.258776  439832 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:47:02.355043  439832 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.22s)
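As the stderr trace shows, image ls is a thin wrapper: minikube inspects the node container, opens an SSH session to the published port, and reads sudo crictl images --output json, then reformats the result (short here; table, json, and yaml in the variants below). The raw data can be fetched the same way (a sketch):

	out/minikube-linux-arm64 -p functional-682596 ssh "sudo crictl images --output json"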

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.22s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-682596 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ docker.io/kicbase/echo-server               │ functional-682596  │ sha256:ce2d2c │ 2.17MB │
│ registry.k8s.io/coredns/coredns             │ v1.13.1            │ sha256:e08f4d │ 21.2MB │
│ registry.k8s.io/etcd                        │ 3.6.6-0            │ sha256:271e49 │ 21.7MB │
│ registry.k8s.io/pause                       │ 3.1                │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/kube-proxy                  │ v1.35.0-rc.1       │ sha256:7e3ace │ 22.4MB │
│ registry.k8s.io/pause                       │ 3.3                │ sha256:3d1873 │ 249kB  │
│ registry.k8s.io/pause                       │ latest             │ sha256:8cb209 │ 71.3kB │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b │ sha256:b1a8c6 │ 40.6MB │
│ localhost/my-image                          │ functional-682596  │ sha256:212d7a │ 831kB  │
│ registry.k8s.io/kube-controller-manager     │ v1.35.0-rc.1       │ sha256:a34b34 │ 20.7MB │
│ registry.k8s.io/pause                       │ 3.10.1             │ sha256:d7b100 │ 268kB  │
│ docker.io/library/minikube-local-cache-test │ functional-682596  │ sha256:05258a │ 992B   │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                 │ sha256:ba04bb │ 8.03MB │
│ registry.k8s.io/kube-apiserver              │ v1.35.0-rc.1       │ sha256:3c6ba2 │ 24.7MB │
│ registry.k8s.io/kube-scheduler              │ v1.35.0-rc.1       │ sha256:abca4d │ 15.4MB │
└─────────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-682596 image ls --format table --alsologtostderr:
I1217 20:47:06.559078  440232 out.go:360] Setting OutFile to fd 1 ...
I1217 20:47:06.559189  440232 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:06.559200  440232 out.go:374] Setting ErrFile to fd 2...
I1217 20:47:06.559204  440232 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:06.559465  440232 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:47:06.560064  440232 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:06.560188  440232 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:06.560745  440232 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:47:06.577913  440232 ssh_runner.go:195] Run: systemctl --version
I1217 20:47:06.577979  440232 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:47:06.598348  440232 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:47:06.694815  440232 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.22s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.23s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-682596 image ls --format json --alsologtostderr:
[{"id":"sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e","repoDigests":["registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-rc.1"],"size":"22432301"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:e08f4d9d2e6e
de8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":["registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"21168808"},{"id":"sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54","repoDigests":["registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-rc.1"],"size":"24692223"},{"id":"sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde","repoDigests":["registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-rc.1"],"size":"15405535"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8
s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57","repoDigests":["registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"],"repoTags":["registry.k8s.io/etcd:3.6.6-0"],"size":"21749640"},{"id":"sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"],"size":"20672157"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicb
ase/echo-server:functional-682596"],"size":"2173567"},{"id":"sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-682596"],"size":"992"},{"id":"sha256:212d7ac8fb5b3349513f07673a14f9641bff360bca9ba48d3b318caf7f938aad","repoDigests":[],"repoTags":["localhost/my-image:functional-682596"],"size":"830617"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-682596 image ls --format json --alsologtostderr:
I1217 20:47:06.337205  440198 out.go:360] Setting OutFile to fd 1 ...
I1217 20:47:06.337317  440198 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:06.337328  440198 out.go:374] Setting ErrFile to fd 2...
I1217 20:47:06.337333  440198 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:06.337598  440198 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:47:06.338205  440198 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:06.338334  440198 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:06.338848  440198 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:47:06.356697  440198 ssh_runner.go:195] Run: systemctl --version
I1217 20:47:06.356752  440198 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:47:06.378431  440198 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:47:06.474779  440198 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.23s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.23s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-682596 image ls --format yaml --alsologtostderr:
- id: sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-rc.1
size: "15405535"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:05258a74f07dd17944d5b57da11e1219f05ceba6a54a10e2544b7da8ff43103b
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-682596
size: "992"
- id: sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "21168808"
- id: sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-rc.1
size: "24692223"
- id: sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
size: "20672157"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-682596
size: "2173567"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57
repoDigests:
- registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890
repoTags:
- registry.k8s.io/etcd:3.6.6-0
size: "21749640"
- id: sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e
repoDigests:
- registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-rc.1
size: "22432301"

                                                
                                                
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-682596 image ls --format yaml --alsologtostderr:
I1217 20:47:02.438031  439870 out.go:360] Setting OutFile to fd 1 ...
I1217 20:47:02.438224  439870 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.438253  439870 out.go:374] Setting ErrFile to fd 2...
I1217 20:47:02.438275  439870 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.438659  439870 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:47:02.439757  439870 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.439947  439870 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.440603  439870 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:47:02.459066  439870 ssh_runner.go:195] Run: systemctl --version
I1217 20:47:02.459122  439870 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:47:02.477074  439870 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:47:02.583204  439870 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.23s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.67s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh pgrep buildkitd: exit status 1 (253.05818ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image build -t localhost/my-image:functional-682596 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 image build -t localhost/my-image:functional-682596 testdata/build --alsologtostderr: (3.187136372s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-682596 image build -t localhost/my-image:functional-682596 testdata/build --alsologtostderr:
I1217 20:47:02.916768  439977 out.go:360] Setting OutFile to fd 1 ...
I1217 20:47:02.916964  439977 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.916991  439977 out.go:374] Setting ErrFile to fd 2...
I1217 20:47:02.917010  439977 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1217 20:47:02.917277  439977 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
I1217 20:47:02.917955  439977 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.918763  439977 config.go:182] Loaded profile config "functional-682596": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1217 20:47:02.919315  439977 cli_runner.go:164] Run: docker container inspect functional-682596 --format={{.State.Status}}
I1217 20:47:02.936766  439977 ssh_runner.go:195] Run: systemctl --version
I1217 20:47:02.936825  439977 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-682596
I1217 20:47:02.954672  439977 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33163 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/functional-682596/id_rsa Username:docker}
I1217 20:47:03.055185  439977 build_images.go:162] Building image from path: /tmp/build.1165864762.tar
I1217 20:47:03.055255  439977 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1217 20:47:03.063348  439977 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1165864762.tar
I1217 20:47:03.067154  439977 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1165864762.tar: stat -c "%s %y" /var/lib/minikube/build/build.1165864762.tar: Process exited with status 1
stdout:

                                                
                                                
stderr:
stat: cannot statx '/var/lib/minikube/build/build.1165864762.tar': No such file or directory
I1217 20:47:03.067186  439977 ssh_runner.go:362] scp /tmp/build.1165864762.tar --> /var/lib/minikube/build/build.1165864762.tar (3072 bytes)
I1217 20:47:03.085442  439977 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1165864762
I1217 20:47:03.093464  439977 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1165864762 -xf /var/lib/minikube/build/build.1165864762.tar
I1217 20:47:03.102810  439977 containerd.go:394] Building image: /var/lib/minikube/build/build.1165864762
I1217 20:47:03.102877  439977 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1165864762 --local dockerfile=/var/lib/minikube/build/build.1165864762 --output type=image,name=localhost/my-image:functional-682596
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

                                                
                                                
#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.7s

                                                
                                                
#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

                                                
                                                
#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.1s

                                                
                                                
#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

                                                
                                                
#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

                                                
                                                
#6 [2/3] RUN true
#6 DONE 0.1s

                                                
                                                
#7 [3/3] ADD content.txt /
#7 DONE 0.0s

                                                
                                                
#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:5fe49719d6ac7d0b04517923e4bb54e88fb4332cf53c01525a9b5365007b2788
#8 exporting manifest sha256:5fe49719d6ac7d0b04517923e4bb54e88fb4332cf53c01525a9b5365007b2788 0.0s done
#8 exporting config sha256:212d7ac8fb5b3349513f07673a14f9641bff360bca9ba48d3b318caf7f938aad 0.0s done
#8 naming to localhost/my-image:functional-682596 done
#8 DONE 0.2s
I1217 20:47:06.030339  439977 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1165864762 --local dockerfile=/var/lib/minikube/build/build.1165864762 --output type=image,name=localhost/my-image:functional-682596: (2.927435093s)
I1217 20:47:06.030423  439977 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1165864762
I1217 20:47:06.038537  439977 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1165864762.tar
I1217 20:47:06.046616  439977 build_images.go:218] Built localhost/my-image:functional-682596 from /tmp/build.1165864762.tar
I1217 20:47:06.046647  439977 build_images.go:134] succeeded building to: functional-682596
I1217 20:47:06.046653  439977 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.67s)
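The buildkit steps above (#1-#8) let the 97-byte testdata/build Dockerfile be read back: a gcr.io/k8s-minikube/busybox base, a no-op RUN, and an ADD of content.txt. A sketch of replaying the build by hand; the file contents are reconstructed from the log, so treat them as approximate:

	cat > Dockerfile <<'EOF'
	FROM gcr.io/k8s-minikube/busybox:latest
	RUN true
	ADD content.txt /
	EOF
	echo test > content.txt    # illustrative; the real build context transferred 62B
	out/minikube-linux-arm64 -p functional-682596 image build -t localhost/my-image:functional-682596 .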

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.28s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.28s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.46s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr: (1.119811568s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.46s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (1.4s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr
functional_test.go:380: (dbg) Done: out/minikube-linux-arm64 -p functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr: (1.072368295s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (1.40s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.61s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-682596
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image load --daemon kicbase/echo-server:functional-682596 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.61s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.16s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.16s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.15s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.15s)
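All three update-context variants run the same command; the test only asserts it succeeds whether or not the kubeconfig needs changes. A sketch of verifying the effect by hand, assuming kubectl is on the PATH:

	out/minikube-linux-arm64 -p functional-682596 update-context --alsologtostderr -v=2
	kubectl config current-context    # expected: functional-682596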

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.42s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image save kicbase/echo-server:functional-682596 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.42s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.58s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image rm kicbase/echo-server:functional-682596 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.58s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.83s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.83s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.51s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-682596
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 image save --daemon kicbase/echo-server:functional-682596 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.51s)
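Taken together, the four ImageCommands subtests above form one round trip. Reproduced by hand it looks like this (the /tmp tarball path is illustrative; everything else is taken from the log):

  out/minikube-linux-arm64 -p functional-682596 image save kicbase/echo-server:functional-682596 /tmp/echo-server.tar
  out/minikube-linux-arm64 -p functional-682596 image rm kicbase/echo-server:functional-682596
  out/minikube-linux-arm64 -p functional-682596 image load /tmp/echo-server.tar
  # or save straight into the host Docker daemon instead of a tarball
  out/minikube-linux-arm64 -p functional-682596 image save --daemon kicbase/echo-server:functional-682596
  out/minikube-linux-arm64 -p functional-682596 image ls   # confirm the result after each step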

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)
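Note the harness tolerates the "exit status 103" above: the tunnel was started as a background daemon by StartTunnel and DeleteTunnel only needs it gone afterwards. A minimal manual equivalent, as a sketch using a shell job rather than the test harness:

  out/minikube-linux-arm64 -p functional-682596 tunnel --alsologtostderr &
  TUNNEL_PID=$!
  # ... exercise LoadBalancer services while the tunnel is up ...
  kill "$TUNNEL_PID"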

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "340.79124ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "52.770664ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.38s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "326.181369ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "52.885915ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.38s)
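The timings above show why both modes are exercised: the light variants skip the per-cluster status probes, returning in ~50ms where the full listing needs ~340ms. The relevant invocations, exactly as run in the log:

  out/minikube-linux-arm64 profile list              # full listing, probes each cluster
  out/minikube-linux-arm64 profile list -l           # light listing
  out/minikube-linux-arm64 profile list -o json --light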

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (1.72s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun793336896/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (336.166173ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1217 20:46:55.363309  369461 retry.go:31] will retry after 347.148015ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun793336896/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "sudo umount -f /mount-9p": exit status 1 (275.667694ms)
-- stdout --
	umount: /mount-9p: not mounted.
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-682596 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun793336896/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (1.72s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (1.82s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T" /mount1: exit status 1 (584.773888ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1217 20:46:57.339434  369461 retry.go:31] will retry after 333.906754ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-682596 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-682596 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2013412110/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (1.82s)
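Both MountCmd subtests reduce to the same manual workflow; a sketch using the flags from the log (the host source path is illustrative):

  out/minikube-linux-arm64 mount -p functional-682596 /tmp/src:/mount-9p --port 46464 &
  # the tests retry this probe because the 9p mount appears asynchronously
  out/minikube-linux-arm64 -p functional-682596 ssh "findmnt -T /mount-9p | grep 9p"
  # kills every mount daemon for the profile at once, as VerifyCleanup checks
  out/minikube-linux-arm64 mount -p functional-682596 --kill=true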

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-682596
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (155.03s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1217 20:49:36.458556  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.464922  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.476302  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.497738  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.539082  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.620560  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:36.782099  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:37.103765  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:37.745532  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:39.027389  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:41.588696  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:46.710008  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:49.015120  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:49:56.952061  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:50:17.434281  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:50:58.396133  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (2m34.190833308s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/StartCluster (155.03s)
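For reference, the invocation under test, stripped of harness flags: --ha provisions multiple control-plane nodes behind the shared VIP that the later status checks hit (192.168.49.254:8443 in the logs below):

  out/minikube-linux-arm64 -p ha-286241 start --ha --memory 3072 --wait true --driver=docker --container-runtime=containerd
  out/minikube-linux-arm64 -p ha-286241 status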

TestMultiControlPlane/serial/DeployApp (7.68s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 kubectl -- rollout status deployment/busybox: (4.771364816s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5drxc -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5tnhc -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-ztx9r -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5drxc -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5tnhc -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-ztx9r -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5drxc -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5tnhc -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-ztx9r -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (7.68s)

TestMultiControlPlane/serial/PingHostFromPods (1.56s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5drxc -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5drxc -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5tnhc -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-5tnhc -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-ztx9r -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec busybox-7b57f96db7-ztx9r -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.56s)
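The nslookup/ping pair above is the whole host-reachability check: awk 'NR==5' picks the answer line of busybox's nslookup output and cut extracts the address, which is then pinged. By hand, against any of the busybox pods (the pod name is a placeholder):

  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec <busybox-pod> -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
  out/minikube-linux-arm64 -p ha-286241 kubectl -- exec <busybox-pod> -- sh -c "ping -c 1 192.168.49.1"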

TestMultiControlPlane/serial/AddWorkerNode (32.32s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node add --alsologtostderr -v 5
E1217 20:51:28.507234  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 node add --alsologtostderr -v 5: (31.269064086s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5: (1.049257154s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (32.32s)

TestMultiControlPlane/serial/NodeLabels (0.11s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-286241 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.11s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.06s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.055847943s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.06s)

TestMultiControlPlane/serial/CopyFile (20.31s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 status --output json --alsologtostderr -v 5: (1.40457322s)
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp testdata/cp-test.txt ha-286241:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile242466247/001/cp-test_ha-286241.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241:/home/docker/cp-test.txt ha-286241-m02:/home/docker/cp-test_ha-286241_ha-286241-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test_ha-286241_ha-286241-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241:/home/docker/cp-test.txt ha-286241-m03:/home/docker/cp-test_ha-286241_ha-286241-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test_ha-286241_ha-286241-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241:/home/docker/cp-test.txt ha-286241-m04:/home/docker/cp-test_ha-286241_ha-286241-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test_ha-286241_ha-286241-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp testdata/cp-test.txt ha-286241-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile242466247/001/cp-test_ha-286241-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m02:/home/docker/cp-test.txt ha-286241:/home/docker/cp-test_ha-286241-m02_ha-286241.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test_ha-286241-m02_ha-286241.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m02:/home/docker/cp-test.txt ha-286241-m03:/home/docker/cp-test_ha-286241-m02_ha-286241-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test_ha-286241-m02_ha-286241-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m02:/home/docker/cp-test.txt ha-286241-m04:/home/docker/cp-test_ha-286241-m02_ha-286241-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test_ha-286241-m02_ha-286241-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp testdata/cp-test.txt ha-286241-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile242466247/001/cp-test_ha-286241-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m03:/home/docker/cp-test.txt ha-286241:/home/docker/cp-test_ha-286241-m03_ha-286241.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test_ha-286241-m03_ha-286241.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m03:/home/docker/cp-test.txt ha-286241-m02:/home/docker/cp-test_ha-286241-m03_ha-286241-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test_ha-286241-m03_ha-286241-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m03:/home/docker/cp-test.txt ha-286241-m04:/home/docker/cp-test_ha-286241-m03_ha-286241-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test_ha-286241-m03_ha-286241-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp testdata/cp-test.txt ha-286241-m04:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile242466247/001/cp-test_ha-286241-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m04:/home/docker/cp-test.txt ha-286241:/home/docker/cp-test_ha-286241-m04_ha-286241.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241 "sudo cat /home/docker/cp-test_ha-286241-m04_ha-286241.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m04:/home/docker/cp-test.txt ha-286241-m02:/home/docker/cp-test_ha-286241-m04_ha-286241-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m02 "sudo cat /home/docker/cp-test_ha-286241-m04_ha-286241-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m04:/home/docker/cp-test.txt ha-286241-m03:/home/docker/cp-test_ha-286241-m04_ha-286241-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test_ha-286241-m04_ha-286241-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.31s)
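The long sequence above is simply every pairing of the host and the four nodes; each individual transfer uses the same two commands (m02 and m03 here stand for any node names reported by status):

  out/minikube-linux-arm64 -p ha-286241 cp testdata/cp-test.txt ha-286241-m02:/home/docker/cp-test.txt     # host -> node
  out/minikube-linux-arm64 -p ha-286241 cp ha-286241-m02:/home/docker/cp-test.txt ha-286241-m03:/home/docker/cp-test.txt   # node -> node
  out/minikube-linux-arm64 -p ha-286241 ssh -n ha-286241-m03 "sudo cat /home/docker/cp-test.txt"           # verify on the target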

TestMultiControlPlane/serial/StopSecondaryNode (13.02s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node stop m02 --alsologtostderr -v 5
E1217 20:52:20.318039  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 node stop m02 --alsologtostderr -v 5: (12.254170312s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5: exit status 7 (764.429692ms)
-- stdout --
	ha-286241
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-286241-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-286241-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-286241-m04
	type: Worker
	host: Running
	kubelet: Running
	
-- /stdout --
** stderr ** 
	I1217 20:52:32.587972  457664 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:52:32.588143  457664 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:52:32.588173  457664 out.go:374] Setting ErrFile to fd 2...
	I1217 20:52:32.588192  457664 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:52:32.588500  457664 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:52:32.588745  457664 out.go:368] Setting JSON to false
	I1217 20:52:32.588798  457664 mustload.go:66] Loading cluster: ha-286241
	I1217 20:52:32.588869  457664 notify.go:221] Checking for updates...
	I1217 20:52:32.590315  457664 config.go:182] Loaded profile config "ha-286241": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 20:52:32.590371  457664 status.go:174] checking status of ha-286241 ...
	I1217 20:52:32.591675  457664 cli_runner.go:164] Run: docker container inspect ha-286241 --format={{.State.Status}}
	I1217 20:52:32.612760  457664 status.go:371] ha-286241 host status = "Running" (err=<nil>)
	I1217 20:52:32.612788  457664 host.go:66] Checking if "ha-286241" exists ...
	I1217 20:52:32.613093  457664 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-286241
	I1217 20:52:32.633461  457664 host.go:66] Checking if "ha-286241" exists ...
	I1217 20:52:32.633885  457664 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:52:32.633928  457664 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-286241
	I1217 20:52:32.653357  457664 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33168 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/ha-286241/id_rsa Username:docker}
	I1217 20:52:32.754383  457664 ssh_runner.go:195] Run: systemctl --version
	I1217 20:52:32.761038  457664 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:52:32.775086  457664 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 20:52:32.841872  457664 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:68 OomKillDisable:true NGoroutines:72 SystemTime:2025-12-17 20:52:32.831816262 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 20:52:32.842468  457664 kubeconfig.go:125] found "ha-286241" server: "https://192.168.49.254:8443"
	I1217 20:52:32.842511  457664 api_server.go:166] Checking apiserver status ...
	I1217 20:52:32.842561  457664 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:52:32.856157  457664 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1459/cgroup
	I1217 20:52:32.868068  457664 api_server.go:182] apiserver freezer: "6:freezer:/docker/2a41b52a9757d7a0b2a90a8c1ceed5c60bf0aedab6b06ede69dd9bd09a395ee4/kubepods/burstable/pod4606fe9a77180cc29a3894343a7dece1/793070fd2bd3f0e9a1878a17dbf3a132666c7334d3a6637a02cd209258f2306b"
	I1217 20:52:32.868162  457664 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/2a41b52a9757d7a0b2a90a8c1ceed5c60bf0aedab6b06ede69dd9bd09a395ee4/kubepods/burstable/pod4606fe9a77180cc29a3894343a7dece1/793070fd2bd3f0e9a1878a17dbf3a132666c7334d3a6637a02cd209258f2306b/freezer.state
	I1217 20:52:32.876023  457664 api_server.go:204] freezer state: "THAWED"
	I1217 20:52:32.876061  457664 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1217 20:52:32.884592  457664 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1217 20:52:32.884632  457664 status.go:463] ha-286241 apiserver status = Running (err=<nil>)
	I1217 20:52:32.884648  457664 status.go:176] ha-286241 status: &{Name:ha-286241 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 20:52:32.884675  457664 status.go:174] checking status of ha-286241-m02 ...
	I1217 20:52:32.885017  457664 cli_runner.go:164] Run: docker container inspect ha-286241-m02 --format={{.State.Status}}
	I1217 20:52:32.904544  457664 status.go:371] ha-286241-m02 host status = "Stopped" (err=<nil>)
	I1217 20:52:32.904570  457664 status.go:384] host is not running, skipping remaining checks
	I1217 20:52:32.904578  457664 status.go:176] ha-286241-m02 status: &{Name:ha-286241-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 20:52:32.904605  457664 status.go:174] checking status of ha-286241-m03 ...
	I1217 20:52:32.904936  457664 cli_runner.go:164] Run: docker container inspect ha-286241-m03 --format={{.State.Status}}
	I1217 20:52:32.923212  457664 status.go:371] ha-286241-m03 host status = "Running" (err=<nil>)
	I1217 20:52:32.923234  457664 host.go:66] Checking if "ha-286241-m03" exists ...
	I1217 20:52:32.923544  457664 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-286241-m03
	I1217 20:52:32.941239  457664 host.go:66] Checking if "ha-286241-m03" exists ...
	I1217 20:52:32.941556  457664 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:52:32.941675  457664 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-286241-m03
	I1217 20:52:32.958914  457664 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33178 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/ha-286241-m03/id_rsa Username:docker}
	I1217 20:52:33.053779  457664 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:52:33.073868  457664 kubeconfig.go:125] found "ha-286241" server: "https://192.168.49.254:8443"
	I1217 20:52:33.073907  457664 api_server.go:166] Checking apiserver status ...
	I1217 20:52:33.073967  457664 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 20:52:33.087248  457664 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1420/cgroup
	I1217 20:52:33.095825  457664 api_server.go:182] apiserver freezer: "6:freezer:/docker/c7ade889994f60c324c40e798bd8abf7adc5aac83d2676e069f9a1da4cab11b0/kubepods/burstable/pod23ac2b5a6d5fdd46e1908d42a7ed8208/a9aeaf70fc0161604938a4e09f83e14e20ab299e056cd3f5eeeeb8ab2a3b7236"
	I1217 20:52:33.095895  457664 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/c7ade889994f60c324c40e798bd8abf7adc5aac83d2676e069f9a1da4cab11b0/kubepods/burstable/pod23ac2b5a6d5fdd46e1908d42a7ed8208/a9aeaf70fc0161604938a4e09f83e14e20ab299e056cd3f5eeeeb8ab2a3b7236/freezer.state
	I1217 20:52:33.104200  457664 api_server.go:204] freezer state: "THAWED"
	I1217 20:52:33.104360  457664 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1217 20:52:33.112705  457664 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1217 20:52:33.112734  457664 status.go:463] ha-286241-m03 apiserver status = Running (err=<nil>)
	I1217 20:52:33.112743  457664 status.go:176] ha-286241-m03 status: &{Name:ha-286241-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 20:52:33.112787  457664 status.go:174] checking status of ha-286241-m04 ...
	I1217 20:52:33.113107  457664 cli_runner.go:164] Run: docker container inspect ha-286241-m04 --format={{.State.Status}}
	I1217 20:52:33.136488  457664 status.go:371] ha-286241-m04 host status = "Running" (err=<nil>)
	I1217 20:52:33.136513  457664 host.go:66] Checking if "ha-286241-m04" exists ...
	I1217 20:52:33.136832  457664 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-286241-m04
	I1217 20:52:33.154337  457664 host.go:66] Checking if "ha-286241-m04" exists ...
	I1217 20:52:33.154649  457664 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 20:52:33.154752  457664 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-286241-m04
	I1217 20:52:33.180681  457664 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33183 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/ha-286241-m04/id_rsa Username:docker}
	I1217 20:52:33.277907  457664 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 20:52:33.296972  457664 status.go:176] ha-286241-m04 status: &{Name:ha-286241-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (13.02s)
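The "exit status 7" above is expected: status reports per-node state and exits non-zero whenever any node is down, so scripted checks should parse the output rather than trust the exit code alone. In short:

  out/minikube-linux-arm64 -p ha-286241 node stop m02
  out/minikube-linux-arm64 -p ha-286241 status || true   # exits 7 while m02 is stopped; the text still lists every node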

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.8s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.80s)

TestMultiControlPlane/serial/RestartSecondaryNode (13.94s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node start m02 --alsologtostderr -v 5
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 node start m02 --alsologtostderr -v 5: (12.281857304s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5: (1.512563684s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (13.94s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.39s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.392759669s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.39s)

TestMultiControlPlane/serial/RestartClusterKeepsNodes (98.86s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 stop --alsologtostderr -v 5
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 stop --alsologtostderr -v 5: (37.541484943s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 start --wait true --alsologtostderr -v 5
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 start --wait true --alsologtostderr -v 5: (1m1.140189571s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (98.86s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.18s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node delete m03 --alsologtostderr -v 5
E1217 20:54:31.574931  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:54:36.456992  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 node delete m03 --alsologtostderr -v 5: (10.223758142s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.18s)
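Deleting a control-plane node and verifying the remainder is the same two-step shown above:

  out/minikube-linux-arm64 -p ha-286241 node delete m03
  kubectl get nodes   # the go-template query above just asserts every remaining node reports Ready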

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.85s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.85s)

TestMultiControlPlane/serial/StopCluster (36.48s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 stop --alsologtostderr -v 5
E1217 20:54:49.014566  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:55:04.162197  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 stop --alsologtostderr -v 5: (36.359158381s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5: exit status 7 (120.254881ms)
-- stdout --
	ha-286241
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-286241-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-286241-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1217 20:55:16.725080  472701 out.go:360] Setting OutFile to fd 1 ...
	I1217 20:55:16.725237  472701 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:55:16.725248  472701 out.go:374] Setting ErrFile to fd 2...
	I1217 20:55:16.725253  472701 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 20:55:16.725504  472701 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 20:55:16.725693  472701 out.go:368] Setting JSON to false
	I1217 20:55:16.725725  472701 mustload.go:66] Loading cluster: ha-286241
	I1217 20:55:16.725831  472701 notify.go:221] Checking for updates...
	I1217 20:55:16.726137  472701 config.go:182] Loaded profile config "ha-286241": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 20:55:16.726161  472701 status.go:174] checking status of ha-286241 ...
	I1217 20:55:16.726716  472701 cli_runner.go:164] Run: docker container inspect ha-286241 --format={{.State.Status}}
	I1217 20:55:16.746076  472701 status.go:371] ha-286241 host status = "Stopped" (err=<nil>)
	I1217 20:55:16.746098  472701 status.go:384] host is not running, skipping remaining checks
	I1217 20:55:16.746105  472701 status.go:176] ha-286241 status: &{Name:ha-286241 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 20:55:16.746136  472701 status.go:174] checking status of ha-286241-m02 ...
	I1217 20:55:16.746435  472701 cli_runner.go:164] Run: docker container inspect ha-286241-m02 --format={{.State.Status}}
	I1217 20:55:16.776984  472701 status.go:371] ha-286241-m02 host status = "Stopped" (err=<nil>)
	I1217 20:55:16.777007  472701 status.go:384] host is not running, skipping remaining checks
	I1217 20:55:16.777014  472701 status.go:176] ha-286241-m02 status: &{Name:ha-286241-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 20:55:16.777033  472701 status.go:174] checking status of ha-286241-m04 ...
	I1217 20:55:16.777323  472701 cli_runner.go:164] Run: docker container inspect ha-286241-m04 --format={{.State.Status}}
	I1217 20:55:16.794472  472701 status.go:371] ha-286241-m04 host status = "Stopped" (err=<nil>)
	I1217 20:55:16.794491  472701 status.go:384] host is not running, skipping remaining checks
	I1217 20:55:16.794510  472701 status.go:176] ha-286241-m04 status: &{Name:ha-286241-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (36.48s)

TestMultiControlPlane/serial/RestartCluster (59.27s)

=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (58.261355823s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (59.27s)

TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.82s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.82s)

TestMultiControlPlane/serial/AddSecondaryNode (83.92s)

=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 node add --control-plane --alsologtostderr -v 5
E1217 20:56:28.506723  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 node add --control-plane --alsologtostderr -v 5: (1m22.739938636s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-286241 status --alsologtostderr -v 5: (1.178729123s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (83.92s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.08s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.081780338s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.08s)

TestJSONOutput/start/Command (48.22s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-980326 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-980326 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd: (48.218850891s)
--- PASS: TestJSONOutput/start/Command (48.22s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.77s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-980326 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.77s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.62s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-980326 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.62s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (5.97s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-980326 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-980326 --output=json --user=testUser: (5.972489816s)
--- PASS: TestJSONOutput/stop/Command (5.97s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.24s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-139013 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-139013 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (93.347599ms)

-- stdout --
	{"specversion":"1.0","id":"a9c8682b-0b01-4339-88c0-0764bdbef73b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-139013] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"52a1df67-5171-4831-9dbe-2e83f29f4381","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21808"}}
	{"specversion":"1.0","id":"f03a817d-6fae-4e25-9aac-5994f2557022","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"306b1b72-856e-441f-8fdf-6a3c1a74394e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig"}}
	{"specversion":"1.0","id":"e8de7659-562a-4ae3-b829-d7fb392b3bc9","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube"}}
	{"specversion":"1.0","id":"16e6b8da-c540-4cb6-8871-5108ed435f66","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"545d4ee5-d935-4fcb-ac71-cc1ba1d20508","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"076f1ab0-8fd1-4ce9-927b-6aa386d6f02b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:176: Cleaning up "json-output-error-139013" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-139013
--- PASS: TestErrorJSONOutput (0.24s)
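
Aside: each line in the stdout block above is one CloudEvents-style JSON object; the Audit and step-ordering subtests earlier in this run consume exactly this stream. A rough sketch of decoding it and checking that currentstep never goes backwards, assuming only the fields visible in the log (the struct below is our own minimal projection, not minikube's type):

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"strconv"
)

// event is a minimal projection of the JSON lines printed by
// "minikube start --output=json"; only the fields used below are declared.
type event struct {
	Type string `json:"type"` // io.k8s.sigs.minikube.step, .info, .error, ...
	Data struct {
		CurrentStep string `json:"currentstep"`
		TotalSteps  string `json:"totalsteps"`
		Message     string `json:"message"`
		ExitCode    string `json:"exitcode"`
	} `json:"data"`
}

func main() {
	// Pipe the JSON output of a minikube command into stdin.
	sc := bufio.NewScanner(os.Stdin)
	prev := -1
	for sc.Scan() {
		var ev event
		if err := json.Unmarshal(sc.Bytes(), &ev); err != nil {
			continue // tolerate any non-JSON noise in the stream
		}
		if ev.Type == "io.k8s.sigs.minikube.step" {
			cur, err := strconv.Atoi(ev.Data.CurrentStep)
			if err != nil {
				continue
			}
			if cur <= prev {
				fmt.Printf("step regression: %d after %d\n", cur, prev)
			}
			prev = cur
		}
	}
}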

TestKicCustomNetwork/create_custom_network (41.51s)

=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-286405 --network=
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-286405 --network=: (39.270225906s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-286405" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-286405
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-286405: (2.212153648s)
--- PASS: TestKicCustomNetwork/create_custom_network (41.51s)

TestKicCustomNetwork/use_default_bridge_network (40.69s)

=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-455007 --network=bridge
E1217 20:59:36.457147  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 20:59:49.015145  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-455007 --network=bridge: (38.489594111s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-455007" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-455007
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-455007: (2.169812943s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (40.69s)

TestKicExistingNetwork (36.12s)

=== RUN   TestKicExistingNetwork
I1217 21:00:14.668811  369461 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1217 21:00:14.689429  369461 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1217 21:00:14.689503  369461 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1217 21:00:14.689521  369461 cli_runner.go:164] Run: docker network inspect existing-network
W1217 21:00:14.705070  369461 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1217 21:00:14.705104  369461 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]

stderr:
Error response from daemon: network existing-network not found
I1217 21:00:14.705119  369461 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]

-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found

** /stderr **
I1217 21:00:14.705216  369461 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1217 21:00:14.722238  369461 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-3e64c97094b7 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:1a:a4:55:13:27:1d} reservation:<nil>}
I1217 21:00:14.722547  369461 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40015ba250}
I1217 21:00:14.722570  369461 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1217 21:00:14.722619  369461 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1217 21:00:14.781759  369461 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-783556 --network=existing-network
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-783556 --network=existing-network: (33.860717861s)
helpers_test.go:176: Cleaning up "existing-network-783556" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-783556
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-783556: (2.112413801s)
I1217 21:00:50.771874  369461 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (36.12s)
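
Aside: the stderr trace spells out the mechanism here: minikube inspects the candidate network, notices 192.168.49.0/24 is already held by an existing bridge, and creates "existing-network" on the next free /24 with an explicit gateway, MTU and labels. A sketch of the same "docker network create" call from Go, with every flag copied from the trace above (an illustration, not minikube's own code path):

package main

import (
	"log"
	"os/exec"
)

func main() {
	// Flag-for-flag reproduction of the network_create.go invocation logged
	// above; 192.168.58.0/24 was the first free private subnet this run found.
	cmd := exec.Command("docker", "network", "create",
		"--driver=bridge",
		"--subnet=192.168.58.0/24",
		"--gateway=192.168.58.1",
		"-o", "--ip-masq",
		"-o", "--icc",
		"-o", "com.docker.network.driver.mtu=1500",
		"--label=created_by.minikube.sigs.k8s.io=true",
		"--label=name.minikube.sigs.k8s.io=existing-network",
		"existing-network")
	if out, err := cmd.CombinedOutput(); err != nil {
		log.Fatalf("docker network create failed: %v\n%s", err, out)
	}
	log.Println("network existing-network created")
}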

TestKicCustomSubnet (35.66s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-103028 --subnet=192.168.60.0/24
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-103028 --subnet=192.168.60.0/24: (33.355074645s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-103028 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:176: Cleaning up "custom-subnet-103028" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-103028
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-103028: (2.283815696s)
--- PASS: TestKicCustomSubnet (35.66s)

TestKicStaticIP (35.88s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-745942 --static-ip=192.168.200.200
E1217 21:01:28.508025  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-745942 --static-ip=192.168.200.200: (33.154286918s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-745942 ip
helpers_test.go:176: Cleaning up "static-ip-745942" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-745942
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-745942: (2.553137903s)
--- PASS: TestKicStaticIP (35.88s)

TestMainNoArgs (0.05s)

=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.05s)

TestMinikubeProfile (70.3s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-014977 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-014977 --driver=docker  --container-runtime=containerd: (30.667753348s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-017832 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-017832 --driver=docker  --container-runtime=containerd: (33.612720563s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-014977
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-017832
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:176: Cleaning up "second-017832" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p second-017832
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p second-017832: (2.243617475s)
helpers_test.go:176: Cleaning up "first-014977" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p first-014977
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p first-014977: (2.384997615s)
--- PASS: TestMinikubeProfile (70.30s)

TestMountStart/serial/StartWithMountFirst (8.74s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-069116 --memory=3072 --mount-string /tmp/TestMountStartserial177404070/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-069116 --memory=3072 --mount-string /tmp/TestMountStartserial177404070/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.73813018s)
--- PASS: TestMountStart/serial/StartWithMountFirst (8.74s)

TestMountStart/serial/VerifyMountFirst (0.27s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-069116 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.27s)

TestMountStart/serial/StartWithMountSecond (8.76s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-071093 --memory=3072 --mount-string /tmp/TestMountStartserial177404070/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-071093 --memory=3072 --mount-string /tmp/TestMountStartserial177404070/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.756203782s)
--- PASS: TestMountStart/serial/StartWithMountSecond (8.76s)

TestMountStart/serial/VerifyMountSecond (0.27s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-071093 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.27s)

TestMountStart/serial/DeleteFirst (1.69s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-069116 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-069116 --alsologtostderr -v=5: (1.692603672s)
--- PASS: TestMountStart/serial/DeleteFirst (1.69s)

TestMountStart/serial/VerifyMountPostDelete (0.26s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-071093 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.26s)

TestMountStart/serial/Stop (1.29s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-071093
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-071093: (1.289618702s)
--- PASS: TestMountStart/serial/Stop (1.29s)

TestMountStart/serial/RestartStopped (7.98s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-071093
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-071093: (6.979433299s)
--- PASS: TestMountStart/serial/RestartStopped (7.98s)

TestMountStart/serial/VerifyMountPostStop (0.31s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-071093 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.31s)
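
Note: the same probe closes every step in this group: if "minikube ssh -- ls /minikube-host" exits zero, the host directory supplied via --mount-string is still attached inside the guest across delete, stop and restart. A tiny sketch of that check, assuming the binary path used throughout this run (mountVisible is our name):

package main

import (
	"fmt"
	"os/exec"
)

// mountVisible reruns the verification used by the VerifyMount* steps:
// list the guest-side mount point over ssh; a non-zero exit means the
// mount (or the machine itself) is gone.
func mountVisible(profile string) bool {
	return exec.Command("out/minikube-linux-arm64", "-p", profile,
		"ssh", "--", "ls", "/minikube-host").Run() == nil
}

func main() {
	fmt.Println("mount attached:", mountVisible("mount-start-2-071093"))
}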

TestMultiNode/serial/FreshStart2Nodes (81.68s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-883974 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
E1217 21:04:32.095156  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:04:36.456497  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:04:49.015134  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-883974 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (1m21.145565249s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (81.68s)

TestMultiNode/serial/DeployApp2Nodes (5.36s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-883974 -- rollout status deployment/busybox: (3.460122621s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-6gjvd -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-9wd2w -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-6gjvd -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-9wd2w -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-6gjvd -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-9wd2w -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.36s)

TestMultiNode/serial/PingHostFrom2Pods (1.06s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-6gjvd -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-6gjvd -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-9wd2w -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-883974 -- exec busybox-7b57f96db7-9wd2w -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (1.06s)
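
For reference, the sh pipeline inside these exec calls does the host discovery: busybox nslookup prints the resolved address of host.minikube.internal on its fifth output line, awk 'NR==5' keeps that line, and cut -d' ' -f3 extracts the address field, which the test then pings (192.168.67.1, the cluster network's gateway). A sketch driving the same probe from Go, assuming the current kubectl context is this cluster and reusing a pod name from this run (the line-5 offset is a busybox nslookup detail):

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// hostIPFromPod runs the same busybox pipeline as multinode_test.go to learn
// which IP the pod resolves for host.minikube.internal.
func hostIPFromPod(pod string) (string, error) {
	script := `nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3`
	out, err := exec.Command("kubectl", "exec", pod, "--", "sh", "-c", script).Output()
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	ip, err := hostIPFromPod("busybox-7b57f96db7-6gjvd") // pod name from this run
	if err != nil {
		fmt.Println("lookup failed:", err)
		return
	}
	fmt.Println("host.minikube.internal resolves to", ip)
}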

TestMultiNode/serial/AddNode (29.52s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-883974 -v=5 --alsologtostderr
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-883974 -v=5 --alsologtostderr: (28.823360077s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (29.52s)

TestMultiNode/serial/MultiNodeLabels (0.09s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-883974 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.09s)

TestMultiNode/serial/ProfileList (0.73s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.73s)

TestMultiNode/serial/CopyFile (10.3s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --output json --alsologtostderr
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp testdata/cp-test.txt multinode-883974:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1639180123/001/cp-test_multinode-883974.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974:/home/docker/cp-test.txt multinode-883974-m02:/home/docker/cp-test_multinode-883974_multinode-883974-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test_multinode-883974_multinode-883974-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974:/home/docker/cp-test.txt multinode-883974-m03:/home/docker/cp-test_multinode-883974_multinode-883974-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test_multinode-883974_multinode-883974-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp testdata/cp-test.txt multinode-883974-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1639180123/001/cp-test_multinode-883974-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m02:/home/docker/cp-test.txt multinode-883974:/home/docker/cp-test_multinode-883974-m02_multinode-883974.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test_multinode-883974-m02_multinode-883974.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m02:/home/docker/cp-test.txt multinode-883974-m03:/home/docker/cp-test_multinode-883974-m02_multinode-883974-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test_multinode-883974-m02_multinode-883974-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp testdata/cp-test.txt multinode-883974-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1639180123/001/cp-test_multinode-883974-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m03:/home/docker/cp-test.txt multinode-883974:/home/docker/cp-test_multinode-883974-m03_multinode-883974.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974 "sudo cat /home/docker/cp-test_multinode-883974-m03_multinode-883974.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 cp multinode-883974-m03:/home/docker/cp-test.txt multinode-883974-m02:/home/docker/cp-test_multinode-883974-m03_multinode-883974-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 ssh -n multinode-883974-m02 "sudo cat /home/docker/cp-test_multinode-883974-m03_multinode-883974-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (10.30s)
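
Note: each cp above is immediately checked by an ssh'd "sudo cat" on the receiving node, so a copy only passes if the bytes arrive intact. A condensed sketch of one such round trip, assuming the binary and profile from this run (copyAndVerify is our name):

package main

import (
	"bytes"
	"fmt"
	"os"
	"os/exec"
)

const minikube = "out/minikube-linux-arm64"

// copyAndVerify copies a local file onto a node of the multinode-883974
// profile, cats it back over ssh, and compares the two.
func copyAndVerify(local, node, remote string) error {
	want, err := os.ReadFile(local)
	if err != nil {
		return err
	}
	if out, err := exec.Command(minikube, "-p", "multinode-883974",
		"cp", local, node+":"+remote).CombinedOutput(); err != nil {
		return fmt.Errorf("cp: %v\n%s", err, out)
	}
	got, err := exec.Command(minikube, "-p", "multinode-883974",
		"ssh", "-n", node, "sudo cat "+remote).Output()
	if err != nil {
		return fmt.Errorf("ssh cat: %v", err)
	}
	if !bytes.Equal(bytes.TrimSpace(got), bytes.TrimSpace(want)) {
		return fmt.Errorf("contents differ after copy to %s", node)
	}
	return nil
}

func main() {
	err := copyAndVerify("testdata/cp-test.txt", "multinode-883974-m02", "/home/docker/cp-test.txt")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("copy verified")
}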

TestMultiNode/serial/StopNode (2.43s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-883974 node stop m03: (1.32621523s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-883974 status: exit status 7 (537.856275ms)

-- stdout --
	multinode-883974
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-883974-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-883974-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr: exit status 7 (560.993314ms)

-- stdout --
	multinode-883974
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-883974-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-883974-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1217 21:05:54.808146  526694 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:05:54.808348  526694 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:05:54.808399  526694 out.go:374] Setting ErrFile to fd 2...
	I1217 21:05:54.808422  526694 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:05:54.808673  526694 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:05:54.808894  526694 out.go:368] Setting JSON to false
	I1217 21:05:54.808953  526694 mustload.go:66] Loading cluster: multinode-883974
	I1217 21:05:54.809039  526694 notify.go:221] Checking for updates...
	I1217 21:05:54.809400  526694 config.go:182] Loaded profile config "multinode-883974": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:05:54.809437  526694 status.go:174] checking status of multinode-883974 ...
	I1217 21:05:54.810277  526694 cli_runner.go:164] Run: docker container inspect multinode-883974 --format={{.State.Status}}
	I1217 21:05:54.830024  526694 status.go:371] multinode-883974 host status = "Running" (err=<nil>)
	I1217 21:05:54.830051  526694 host.go:66] Checking if "multinode-883974" exists ...
	I1217 21:05:54.830353  526694 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-883974
	I1217 21:05:54.855835  526694 host.go:66] Checking if "multinode-883974" exists ...
	I1217 21:05:54.856194  526694 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 21:05:54.856301  526694 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-883974
	I1217 21:05:54.885552  526694 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33288 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/multinode-883974/id_rsa Username:docker}
	I1217 21:05:54.977633  526694 ssh_runner.go:195] Run: systemctl --version
	I1217 21:05:54.984561  526694 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 21:05:54.998681  526694 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:05:55.074868  526694 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:50 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-17 21:05:55.063718404 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:05:55.075399  526694 kubeconfig.go:125] found "multinode-883974" server: "https://192.168.67.2:8443"
	I1217 21:05:55.075432  526694 api_server.go:166] Checking apiserver status ...
	I1217 21:05:55.075473  526694 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1217 21:05:55.088579  526694 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1461/cgroup
	I1217 21:05:55.097914  526694 api_server.go:182] apiserver freezer: "6:freezer:/docker/91bd41c24f065025409c6213d6e4279ca5dc5e63847c3588a033c8a2fd1f28c8/kubepods/burstable/pod63e512dd34099a80054f4ed03ba1a126/bac8e97eb13c324fd4d6ec6a36aa63ea45cfdb27d53f41b7f540c0247a1ec851"
	I1217 21:05:55.097991  526694 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/91bd41c24f065025409c6213d6e4279ca5dc5e63847c3588a033c8a2fd1f28c8/kubepods/burstable/pod63e512dd34099a80054f4ed03ba1a126/bac8e97eb13c324fd4d6ec6a36aa63ea45cfdb27d53f41b7f540c0247a1ec851/freezer.state
	I1217 21:05:55.106319  526694 api_server.go:204] freezer state: "THAWED"
	I1217 21:05:55.106351  526694 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1217 21:05:55.114937  526694 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1217 21:05:55.114967  526694 status.go:463] multinode-883974 apiserver status = Running (err=<nil>)
	I1217 21:05:55.114978  526694 status.go:176] multinode-883974 status: &{Name:multinode-883974 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 21:05:55.115008  526694 status.go:174] checking status of multinode-883974-m02 ...
	I1217 21:05:55.115337  526694 cli_runner.go:164] Run: docker container inspect multinode-883974-m02 --format={{.State.Status}}
	I1217 21:05:55.133410  526694 status.go:371] multinode-883974-m02 host status = "Running" (err=<nil>)
	I1217 21:05:55.133439  526694 host.go:66] Checking if "multinode-883974-m02" exists ...
	I1217 21:05:55.133751  526694 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-883974-m02
	I1217 21:05:55.152218  526694 host.go:66] Checking if "multinode-883974-m02" exists ...
	I1217 21:05:55.152574  526694 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1217 21:05:55.152623  526694 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-883974-m02
	I1217 21:05:55.178266  526694 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33293 SSHKeyPath:/home/jenkins/minikube-integration/21808-367595/.minikube/machines/multinode-883974-m02/id_rsa Username:docker}
	I1217 21:05:55.269608  526694 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1217 21:05:55.282849  526694 status.go:176] multinode-883974-m02 status: &{Name:multinode-883974-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1217 21:05:55.282893  526694 status.go:174] checking status of multinode-883974-m03 ...
	I1217 21:05:55.283207  526694 cli_runner.go:164] Run: docker container inspect multinode-883974-m03 --format={{.State.Status}}
	I1217 21:05:55.305494  526694 status.go:371] multinode-883974-m03 host status = "Stopped" (err=<nil>)
	I1217 21:05:55.305516  526694 status.go:384] host is not running, skipping remaining checks
	I1217 21:05:55.305523  526694 status.go:176] multinode-883974-m03 status: &{Name:multinode-883974-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.43s)
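
Aside: the stderr trace shows the full status algorithm for a control-plane node: inspect the container state, confirm kubelet is active, pgrep the kube-apiserver process, check that its freezer cgroup is THAWED, and finally GET /healthz on the apiserver. A sketch of just that last probe, assuming the endpoint from this run; skipping TLS verification is our shortcut here because the cluster CA isn't loaded, not what minikube does:

package main

import (
	"crypto/tls"
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	// Endpoint taken from the trace above. Production code would trust the
	// cluster CA certificate instead of disabling verification.
	client := &http.Client{
		Timeout:   5 * time.Second,
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get("https://192.168.67.2:8443/healthz")
	if err != nil {
		fmt.Println("apiserver unreachable:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("healthz %d: %s\n", resp.StatusCode, body)
}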

TestMultiNode/serial/StartAfterStop (8.18s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 node start m03 -v=5 --alsologtostderr
E1217 21:05:59.524394  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-883974 node start m03 -v=5 --alsologtostderr: (7.407000016s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (8.18s)

TestMultiNode/serial/RestartKeepsNodes (81.48s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-883974
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-883974
E1217 21:06:28.507569  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-883974: (25.13515365s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-883974 --wait=true -v=5 --alsologtostderr
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-883974 --wait=true -v=5 --alsologtostderr: (56.222792874s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-883974
--- PASS: TestMultiNode/serial/RestartKeepsNodes (81.48s)

TestMultiNode/serial/DeleteNode (5.68s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-883974 node delete m03: (4.958733104s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.68s)

TestMultiNode/serial/StopMultiNode (24.11s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-883974 stop: (23.914017535s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-883974 status: exit status 7 (98.436305ms)

-- stdout --
	multinode-883974
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-883974-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr: exit status 7 (94.705328ms)
-- stdout --
	multinode-883974
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-883974-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1217 21:07:54.714359  535575 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:07:54.714479  535575 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:07:54.714489  535575 out.go:374] Setting ErrFile to fd 2...
	I1217 21:07:54.714494  535575 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:07:54.714732  535575 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:07:54.714917  535575 out.go:368] Setting JSON to false
	I1217 21:07:54.714955  535575 mustload.go:66] Loading cluster: multinode-883974
	I1217 21:07:54.715025  535575 notify.go:221] Checking for updates...
	I1217 21:07:54.715887  535575 config.go:182] Loaded profile config "multinode-883974": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:07:54.715917  535575 status.go:174] checking status of multinode-883974 ...
	I1217 21:07:54.716472  535575 cli_runner.go:164] Run: docker container inspect multinode-883974 --format={{.State.Status}}
	I1217 21:07:54.734729  535575 status.go:371] multinode-883974 host status = "Stopped" (err=<nil>)
	I1217 21:07:54.734751  535575 status.go:384] host is not running, skipping remaining checks
	I1217 21:07:54.734759  535575 status.go:176] multinode-883974 status: &{Name:multinode-883974 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1217 21:07:54.734790  535575 status.go:174] checking status of multinode-883974-m02 ...
	I1217 21:07:54.735174  535575 cli_runner.go:164] Run: docker container inspect multinode-883974-m02 --format={{.State.Status}}
	I1217 21:07:54.760623  535575 status.go:371] multinode-883974-m02 host status = "Stopped" (err=<nil>)
	I1217 21:07:54.760656  535575 status.go:384] host is not running, skipping remaining checks
	I1217 21:07:54.760674  535575 status.go:176] multinode-883974-m02 status: &{Name:multinode-883974-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.11s)
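Both status invocations above exit with status 7 rather than 0, which is how the test confirms the cluster is really down: minikube status reports a stopped host through its exit code instead of failing outright. A minimal sketch of reading that code from Go follows; the binary path and profile name are taken from the log, and the specific value 7 is treated as an observation from this run rather than a documented contract:

package main

import (
	"errors"
	"fmt"
	"os/exec"
)

func main() {
	cmd := exec.Command("out/minikube-linux-arm64", "-p", "multinode-883974", "status")
	out, err := cmd.Output()
	fmt.Print(string(out)) // host/kubelet/apiserver/kubeconfig summary, as in the log
	var ee *exec.ExitError
	if errors.As(err, &ee) {
		// A stopped cluster surfaces as a non-zero exit code (7 in this run).
		fmt.Println("status exit code:", ee.ExitCode())
	} else if err != nil {
		fmt.Println("failed to run minikube:", err)
	}
}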
TestMultiNode/serial/RestartMultiNode (56.36s)
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-883974 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-883974 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (55.668341585s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-883974 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (56.36s)
TestMultiNode/serial/ValidateNameConflict (36.14s)
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-883974
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-883974-m02 --driver=docker  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-883974-m02 --driver=docker  --container-runtime=containerd: exit status 14 (94.809957ms)
-- stdout --
	* [multinode-883974-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	! Profile name 'multinode-883974-m02' is duplicated with machine name 'multinode-883974-m02' in profile 'multinode-883974'
	X Exiting due to MK_USAGE: Profile name should be unique
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-883974-m03 --driver=docker  --container-runtime=containerd
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-883974-m03 --driver=docker  --container-runtime=containerd: (33.57579126s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-883974
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-883974: exit status 80 (323.260262ms)
-- stdout --
	* Adding node m03 to cluster multinode-883974 as [worker]
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-883974-m03 already exists in multinode-883974-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-883974-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-883974-m03: (2.086581457s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (36.14s)
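The two rejected commands above are the point of this test: a new profile may not reuse a machine name that already belongs to another profile (exit 14, MK_USAGE), and node add refuses a node whose name is already taken by a standalone profile (exit 80, GUEST_NODE_ADD). A rough sketch of the first rule, using illustrative types and data rather than minikube's real structures:

package main

import "fmt"

// validateProfileName rejects a profile name that collides with any machine
// name inside an existing profile, mirroring the MK_USAGE error in the log.
func validateProfileName(name string, machinesByProfile map[string][]string) error {
	for profile, machines := range machinesByProfile {
		for _, m := range machines {
			if m == name {
				return fmt.Errorf("profile name %q is duplicated with machine name %q in profile %q", name, m, profile)
			}
		}
	}
	return nil
}

func main() {
	existing := map[string][]string{
		"multinode-883974": {"multinode-883974", "multinode-883974-m02"},
	}
	fmt.Println(validateProfileName("multinode-883974-m02", existing)) // rejected
	fmt.Println(validateProfileName("multinode-883974-m03", existing)) // nil: allowed
}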
TestPreload (119.86s)
=== RUN   TestPreload
preload_test.go:41: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-947976 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd
E1217 21:09:36.457877  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:09:49.015150  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:41: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-947976 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd: (58.89076942s)
preload_test.go:49: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-947976 image pull gcr.io/k8s-minikube/busybox
preload_test.go:49: (dbg) Done: out/minikube-linux-arm64 -p test-preload-947976 image pull gcr.io/k8s-minikube/busybox: (2.502953169s)
preload_test.go:55: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-947976
preload_test.go:55: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-947976: (5.906285178s)
preload_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-947976 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd
E1217 21:11:11.576387  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:11:28.507632  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-947976 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd: (49.86532436s)
preload_test.go:68: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-947976 image list
helpers_test.go:176: Cleaning up "test-preload-947976" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-947976
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-947976: (2.440787419s)
--- PASS: TestPreload (119.86s)
TestScheduledStopUnix (105.04s)
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-905103 --memory=3072 --driver=docker  --container-runtime=containerd
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-905103 --memory=3072 --driver=docker  --container-runtime=containerd: (29.160414379s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-905103 --schedule 5m -v=5 --alsologtostderr
minikube stop output:
** stderr ** 
	I1217 21:12:00.619107  551650 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:12:00.619252  551650 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:00.619264  551650 out.go:374] Setting ErrFile to fd 2...
	I1217 21:12:00.619270  551650 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:00.619573  551650 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:12:00.619883  551650 out.go:368] Setting JSON to false
	I1217 21:12:00.620045  551650 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:00.620459  551650 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:12:00.620582  551650 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/config.json ...
	I1217 21:12:00.620822  551650 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:00.620988  551650 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-905103 -n scheduled-stop-905103
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-905103 --schedule 15s -v=5 --alsologtostderr
minikube stop output:
** stderr ** 
	I1217 21:12:01.071434  551743 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:12:01.071640  551743 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:01.071676  551743 out.go:374] Setting ErrFile to fd 2...
	I1217 21:12:01.071714  551743 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:01.072204  551743 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:12:01.073428  551743 out.go:368] Setting JSON to false
	I1217 21:12:01.073664  551743 daemonize_unix.go:73] killing process 551669 as it is an old scheduled stop
	I1217 21:12:01.073752  551743 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:01.074128  551743 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:12:01.074202  551743 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/config.json ...
	I1217 21:12:01.074373  551743 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:01.074482  551743 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1217 21:12:01.084697  369461 retry.go:31] will retry after 92.265µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.085417  369461 retry.go:31] will retry after 86.976µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.086546  369461 retry.go:31] will retry after 201.47µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.087658  369461 retry.go:31] will retry after 378.391µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.088799  369461 retry.go:31] will retry after 552.07µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.089932  369461 retry.go:31] will retry after 762.435µs: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.091009  369461 retry.go:31] will retry after 1.377584ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.093220  369461 retry.go:31] will retry after 1.623235ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.095451  369461 retry.go:31] will retry after 3.721536ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.099755  369461 retry.go:31] will retry after 4.671752ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.104971  369461 retry.go:31] will retry after 2.952866ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.108191  369461 retry.go:31] will retry after 10.39913ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.119475  369461 retry.go:31] will retry after 19.109098ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.139712  369461 retry.go:31] will retry after 27.933154ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
I1217 21:12:01.167943  369461 retry.go:31] will retry after 39.66512ms: open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-905103 --cancel-scheduled
minikube stop output:
-- stdout --
	* All existing scheduled stops cancelled
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-905103 -n scheduled-stop-905103
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-905103
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-905103 --schedule 15s -v=5 --alsologtostderr
minikube stop output:
** stderr ** 
	I1217 21:12:27.032435  552433 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:12:27.032569  552433 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:27.032580  552433 out.go:374] Setting ErrFile to fd 2...
	I1217 21:12:27.032586  552433 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:12:27.032841  552433 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:12:27.033116  552433 out.go:368] Setting JSON to false
	I1217 21:12:27.033270  552433 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:27.033628  552433 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1217 21:12:27.033710  552433 profile.go:143] Saving config to /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/scheduled-stop-905103/config.json ...
	I1217 21:12:27.033909  552433 mustload.go:66] Loading cluster: scheduled-stop-905103
	I1217 21:12:27.034034  552433 config.go:182] Loaded profile config "scheduled-stop-905103": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-905103
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-905103: exit status 7 (76.877104ms)
-- stdout --
	scheduled-stop-905103
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-905103 -n scheduled-stop-905103
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-905103 -n scheduled-stop-905103: exit status 7 (73.969875ms)
-- stdout --
	Stopped
-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:176: Cleaning up "scheduled-stop-905103" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-905103
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-905103: (4.2433731s)
--- PASS: TestScheduledStopUnix (105.04s)
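The retry.go lines in the middle of this test show the polling loop that waits for the scheduled-stop pid file to appear: each failed open is retried after a delay that grows roughly geometrically with jitter (92µs, 87µs, 201µs, ... 39.7ms). A small self-contained sketch of that pattern follows; it is illustrative, not minikube's actual retry.go:

package main

import (
	"fmt"
	"math/rand"
	"os"
	"time"
)

// retryWithBackoff re-runs op with an approximately doubling, jittered wait
// until it succeeds or the deadline elapses, echoing the growth in the log.
func retryWithBackoff(op func() error, deadline time.Duration) error {
	start := time.Now()
	wait := 100 * time.Microsecond
	for {
		err := op()
		if err == nil {
			return nil
		}
		if time.Since(start) > deadline {
			return err
		}
		fmt.Printf("will retry after %v: %v\n", wait, err)
		time.Sleep(wait)
		// Multiply by a factor in [1.5, 2.5): doubling on average, with jitter.
		wait = time.Duration(float64(wait) * (1.5 + rand.Float64()))
	}
}

func main() {
	err := retryWithBackoff(func() error {
		_, err := os.Open("/nonexistent/pid") // stand-in for the profile pid file
		return err
	}, 5*time.Millisecond)
	fmt.Println("final:", err)
}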
TestInsufficientStorage (12.61s)
=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-553437 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-553437 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd: exit status 26 (10.058630916s)
-- stdout --
	{"specversion":"1.0","id":"669c9fc8-6ad6-457c-ba14-ff69c4a0c620","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-553437] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"f6d8c826-72b6-4838-b06e-a97196b2cfbf","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21808"}}
	{"specversion":"1.0","id":"5a21b3e8-7f8b-40b6-8aeb-e4011244018b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"d0e88055-7174-48ee-88ff-d31861329209","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig"}}
	{"specversion":"1.0","id":"0fbec060-5428-4ed3-a984-4bbb6cf8af2f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube"}}
	{"specversion":"1.0","id":"f27e189a-6f41-488a-8400-2d86fd9acbfa","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"fc114280-7161-493e-83a7-0fa1606592af","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"446950d4-35ed-445c-94e0-1b6b02155ba2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"4cae3ae7-30bd-4452-9dbd-fc50e500c70d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"5b948a93-acf6-4ec4-bfc1-f6ed0443a484","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"b9530d5d-36b2-482b-a056-843569903897","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"ec1d8c50-db02-4ee6-bc31-5fc76156895a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-553437\" primary control-plane node in \"insufficient-storage-553437\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"efdb2fab-9313-4ab9-aaba-a368ddc9ae4c","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1765661130-22141 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"1d9ef141-d887-4bde-ab14-b04597d0c3b0","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"acc65150-e28c-455a-abe1-55ef79deb5d8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}
-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-553437 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-553437 --output=json --layout=cluster: exit status 7 (303.57745ms)
-- stdout --
	{"Name":"insufficient-storage-553437","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-553437","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}
-- /stdout --
** stderr ** 
	E1217 21:13:26.784479  554253 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-553437" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-553437 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-553437 --output=json --layout=cluster: exit status 7 (294.792067ms)
-- stdout --
	{"Name":"insufficient-storage-553437","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-553437","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}
-- /stdout --
** stderr ** 
	E1217 21:13:27.081335  554318 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-553437" does not appear in /home/jenkins/minikube-integration/21808-367595/kubeconfig
	E1217 21:13:27.091387  554318 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/insufficient-storage-553437/events.json: no such file or directory
** /stderr **
helpers_test.go:176: Cleaning up "insufficient-storage-553437" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-553437
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-553437: (1.952112671s)
--- PASS: TestInsufficientStorage (12.61s)
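With --output=json, every line of the start output above is a CloudEvents-style JSON object whose data payload carries the step or error details; the test keys off the final io.k8s.sigs.minikube.error event (RSRC_DOCKER_STORAGE, exitcode 26). A minimal sketch of decoding such a line, modelling only the fields shown in the log (the sample below reuses the error event's id and name but trims its message for brevity):

package main

import (
	"encoding/json"
	"fmt"
)

// minikubeEvent models just the fields used here; in the log all data values
// are strings, so map[string]string is enough for this sketch.
type minikubeEvent struct {
	SpecVersion string            `json:"specversion"`
	Type        string            `json:"type"`
	Data        map[string]string `json:"data"`
}

func main() {
	line := `{"specversion":"1.0","id":"acc65150-e28c-455a-abe1-55ef79deb5d8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"exitcode":"26","message":"Docker is out of disk space! (/var is at 100% of capacity).","name":"RSRC_DOCKER_STORAGE"}}`
	var ev minikubeEvent
	if err := json.Unmarshal([]byte(line), &ev); err != nil {
		panic(err)
	}
	fmt.Println(ev.Type, ev.Data["name"], "exitcode", ev.Data["exitcode"])
}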
TestRunningBinaryUpgrade (66.08s)
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.35.0.3956973140 start -p running-upgrade-300562 --memory=3072 --vm-driver=docker  --container-runtime=containerd
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.35.0.3956973140 start -p running-upgrade-300562 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (33.666234953s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-300562 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-300562 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (28.552414556s)
helpers_test.go:176: Cleaning up "running-upgrade-300562" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-300562
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-300562: (2.395113791s)
--- PASS: TestRunningBinaryUpgrade (66.08s)
TestMissingContainerUpgrade (132.85s)
=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade
=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.35.0.2455242578 start -p missing-upgrade-335298 --memory=3072 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.35.0.2455242578 start -p missing-upgrade-335298 --memory=3072 --driver=docker  --container-runtime=containerd: (1m3.818844412s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-335298
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-335298
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-335298 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1217 21:14:36.456481  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-335298 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (1m4.209429013s)
helpers_test.go:176: Cleaning up "missing-upgrade-335298" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-335298
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-335298: (2.59670728s)
--- PASS: TestMissingContainerUpgrade (132.85s)
TestNoKubernetes/serial/StartNoK8sWithVersion (0.1s)
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd: exit status 14 (102.175607ms)
-- stdout --
	* [NoKubernetes-639163] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.10s)
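This subtest only checks argument validation: combining --no-kubernetes with an explicit --kubernetes-version is rejected up front with MK_USAGE (exit 14) before any cluster work starts. A hedged sketch of that kind of mutual-exclusion check; the flag names match the CLI, but the validation helper itself is illustrative, not minikube's code:

package main

import (
	"flag"
	"fmt"
	"os"
)

func main() {
	noK8s := flag.Bool("no-kubernetes", false, "start without Kubernetes")
	version := flag.String("kubernetes-version", "", "Kubernetes version to use")
	flag.Parse()

	// The two flags contradict each other, so reject the combination early.
	if *noK8s && *version != "" {
		fmt.Fprintln(os.Stderr, "X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes")
		os.Exit(14) // exit code observed in the log for MK_USAGE
	}
	fmt.Println("flags ok")
}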
TestNoKubernetes/serial/StartWithK8s (40.37s)
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-639163 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-639163 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (39.883120943s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-639163 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (40.37s)
TestNoKubernetes/serial/StartWithStopK8s (19.78s)
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (16.791474965s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-639163 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-639163 status -o json: exit status 2 (494.16573ms)
-- stdout --
	{"Name":"NoKubernetes-639163","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}
-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-639163
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-639163: (2.498053403s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (19.78s)
TestNoKubernetes/serial/Start (8.52s)
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-639163 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (8.519862993s)
--- PASS: TestNoKubernetes/serial/Start (8.52s)
TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)
=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/21808-367595/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)
TestNoKubernetes/serial/VerifyK8sNotRunning (0.29s)
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-639163 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-639163 "sudo systemctl is-active --quiet service kubelet": exit status 1 (293.394117ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.29s)
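The assertion here leans on systemctl's exit-code contract: is-active --quiet prints nothing and exits 0 only when the unit is active, so a stopped kubelet surfaces as the non-zero exit seen above (status 3 is systemd's "inactive"), which is exactly what the test expects. A small sketch of the same check from Go; it is illustrative and runs systemctl locally, whereas the test runs it inside the node over minikube ssh:

package main

import (
	"errors"
	"fmt"
	"os/exec"
)

// kubeletActive reports whether the kubelet unit is active on this host.
func kubeletActive() (bool, error) {
	err := exec.Command("systemctl", "is-active", "--quiet", "kubelet").Run()
	if err == nil {
		return true, nil
	}
	var ee *exec.ExitError
	if errors.As(err, &ee) {
		return false, nil // systemctl ran; non-zero exit means not active
	}
	return false, err // systemctl itself could not be run
}

func main() {
	active, err := kubeletActive()
	fmt.Println("kubelet active:", active, "err:", err)
}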
TestNoKubernetes/serial/ProfileList (0.7s)
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.70s)
TestNoKubernetes/serial/Stop (1.3s)
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-639163
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-639163: (1.299782112s)
--- PASS: TestNoKubernetes/serial/Stop (1.30s)
TestNoKubernetes/serial/StartNoArgs (6.78s)
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-639163 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-639163 --driver=docker  --container-runtime=containerd: (6.780930772s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (6.78s)
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-639163 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-639163 "sudo systemctl is-active --quiet service kubelet": exit status 1 (277.652395ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)
TestNetworkPlugins/group/false (4.76s)
=== RUN   TestNetworkPlugins/group/false
net_test.go:246: (dbg) Run:  out/minikube-linux-arm64 start -p false-675779 --memory=3072 --alsologtostderr --cni=false --driver=docker  --container-runtime=containerd
net_test.go:246: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p false-675779 --memory=3072 --alsologtostderr --cni=false --driver=docker  --container-runtime=containerd: exit status 14 (188.450699ms)
-- stdout --
	* [false-675779] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21808
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	
	
-- /stdout --
** stderr ** 
	I1217 21:14:52.403506  564488 out.go:360] Setting OutFile to fd 1 ...
	I1217 21:14:52.403621  564488 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:14:52.403631  564488 out.go:374] Setting ErrFile to fd 2...
	I1217 21:14:52.403637  564488 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1217 21:14:52.403981  564488 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21808-367595/.minikube/bin
	I1217 21:14:52.404498  564488 out.go:368] Setting JSON to false
	I1217 21:14:52.405348  564488 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":14238,"bootTime":1765991855,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1217 21:14:52.405442  564488 start.go:143] virtualization:  
	I1217 21:14:52.408906  564488 out.go:179] * [false-675779] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1217 21:14:52.412811  564488 out.go:179]   - MINIKUBE_LOCATION=21808
	I1217 21:14:52.413080  564488 notify.go:221] Checking for updates...
	I1217 21:14:52.418626  564488 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1217 21:14:52.421420  564488 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21808-367595/kubeconfig
	I1217 21:14:52.424321  564488 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21808-367595/.minikube
	I1217 21:14:52.427294  564488 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1217 21:14:52.430047  564488 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1217 21:14:52.433668  564488 config.go:182] Loaded profile config "missing-upgrade-335298": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.32.0
	I1217 21:14:52.433797  564488 driver.go:422] Setting default libvirt URI to qemu:///system
	I1217 21:14:52.466291  564488 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1217 21:14:52.466422  564488 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1217 21:14:52.525174  564488 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:25 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-17 21:14:52.513595918 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1217 21:14:52.525290  564488 docker.go:319] overlay module found
	I1217 21:14:52.528551  564488 out.go:179] * Using the docker driver based on user configuration
	I1217 21:14:52.531431  564488 start.go:309] selected driver: docker
	I1217 21:14:52.531450  564488 start.go:927] validating driver "docker" against <nil>
	I1217 21:14:52.531465  564488 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1217 21:14:52.534965  564488 out.go:203] 
	W1217 21:14:52.537780  564488 out.go:285] X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	I1217 21:14:52.540679  564488 out.go:203] 
** /stderr **
net_test.go:88: 
----------------------- debugLogs start: false-675779 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: false-675779
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: false-675779
>>> host: /etc/nsswitch.conf:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"
>>> host: /etc/hosts:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"
>>> host: /etc/resolv.conf:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: false-675779
>>> host: crictl pods:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"
>>> host: crictl containers:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"
>>> k8s: describe netcat deployment:
error: context "false-675779" does not exist
>>> k8s: describe netcat pod(s):
error: context "false-675779" does not exist
>>> k8s: netcat logs:
error: context "false-675779" does not exist
>>> k8s: describe coredns deployment:
error: context "false-675779" does not exist
>>> k8s: describe coredns pods:
error: context "false-675779" does not exist
>>> k8s: coredns logs:
error: context "false-675779" does not exist
>>> k8s: describe api server pod(s):
error: context "false-675779" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "false-675779" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "false-675779" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "false-675779" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "false-675779" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: cluster_info
    server: https://192.168.85.2:8443
  name: missing-upgrade-335298
contexts:
- context:
    cluster: missing-upgrade-335298
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: context_info
    namespace: default
    user: missing-upgrade-335298
  name: missing-upgrade-335298
current-context: missing-upgrade-335298
kind: Config
preferences: {}
users:
- name: missing-upgrade-335298
  user:
    client-certificate: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.crt
    client-key: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.key
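
A note on the dump above: the collected kubeconfig still points at the missing-upgrade-335298 profile, which is why every kubectl-backed collector for false-675779 reports a missing context. A minimal sketch for confirming that mismatch by hand, using stock kubectl subcommands and only the profile names that appear in this log:

    # list every context kubectl knows about; false-675779 is absent at this point
    kubectl config get-contexts -o name
    # show the active context (missing-upgrade-335298 in the config above)
    kubectl config current-context
    # this would only succeed once the false-675779 profile exists again
    kubectl config use-context false-675779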

>>> k8s: cms:
Error in configuration: context was not found for specified context: false-675779

>>> host: docker daemon status:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: docker daemon config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /etc/docker/daemon.json:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: docker system info:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: cri-docker daemon status:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: cri-docker daemon config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: cri-dockerd version:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: containerd daemon status:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: containerd daemon config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /lib/systemd/system/containerd.service:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /etc/containerd/config.toml:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: containerd config dump:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: crio daemon status:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: crio daemon config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: /etc/crio:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

>>> host: crio config:
* Profile "false-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-675779"

----------------------- debugLogs end: false-675779 [took: 4.36109439s] --------------------------------
helpers_test.go:176: Cleaning up "false-675779" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p false-675779
--- PASS: TestNetworkPlugins/group/false (4.76s)
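
Every collector in the debug dump above failed for the same reason: the false-675779 profile was already gone by the time debugLogs ran. A short sketch of the recovery steps the log itself suggests (these are exactly the commands printed in the dump, nothing extra):

    # see which profiles actually exist on this host
    minikube profile list
    # recreate the profile only if it is actually wanted
    minikube start -p false-675779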

TestStoppedBinaryUpgrade/Setup (1.26s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (1.26s)

TestStoppedBinaryUpgrade/Upgrade (304.9s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.35.0.2618640595 start -p stopped-upgrade-305425 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1217 21:19:49.015352  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.35.0.2618640595 start -p stopped-upgrade-305425 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (33.325604754s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.35.0.2618640595 -p stopped-upgrade-305425 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.35.0.2618640595 -p stopped-upgrade-305425 stop: (1.290750786s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-305425 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1217 21:21:12.096904  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:21:28.507544  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:22:39.526393  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:24:36.456412  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-305425 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m30.284433179s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (304.90s)
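
For reference, the Upgrade subtest above is a three-step start/stop/upgrade cycle: boot with an old release binary, stop the cluster, then restart the same profile with the binary under test. Condensed from the exact commands in this run (the /tmp path is the cached v1.35.0 release binary used by the harness):

    # 1. start the cluster with the previous release
    /tmp/minikube-v1.35.0.2618640595 start -p stopped-upgrade-305425 --memory=3072 --vm-driver=docker --container-runtime=containerd
    # 2. stop it with that same old binary
    /tmp/minikube-v1.35.0.2618640595 -p stopped-upgrade-305425 stop
    # 3. restart the stopped profile with the new binary; this restart is the upgrade under test
    out/minikube-linux-arm64 start -p stopped-upgrade-305425 --memory=3072 --alsologtostderr -v=1 --driver=docker --container-runtime=containerd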

TestStoppedBinaryUpgrade/MinikubeLogs (2.14s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-305425
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-305425: (2.143217798s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.14s)

TestPause/serial/Start (51.23s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-328041 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd
E1217 21:24:49.014546  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-328041 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd: (51.227777604s)
--- PASS: TestPause/serial/Start (51.23s)

TestPause/serial/SecondStartNoReconfiguration (6.45s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-328041 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-328041 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (6.439317316s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (6.45s)

TestPause/serial/Pause (0.71s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-328041 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.71s)

TestPause/serial/VerifyStatus (0.36s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p pause-328041 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p pause-328041 --output=json --layout=cluster: exit status 2 (363.973329ms)

-- stdout --
	{"Name":"pause-328041","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 7 containers in: kube-system, kubernetes-dashboard, istio-operator","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-328041","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.36s)
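
The JSON above encodes pause state numerically: 418 means Paused, 405 Stopped, 200 OK, and the non-zero exit code is expected while the cluster is paused. A hedged sketch for pulling the per-component codes out of that output, assuming jq is available on the host (the test itself does this comparison in Go, not with jq):

    out/minikube-linux-arm64 status -p pause-328041 --output=json --layout=cluster \
      | jq '.Nodes[].Components | map_values(.StatusCode)'
    # expected while paused: { "apiserver": 418, "kubelet": 405 }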

TestPause/serial/Unpause (0.67s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-arm64 unpause -p pause-328041 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.67s)

TestPause/serial/PauseAgain (0.86s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-328041 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.86s)

TestPause/serial/DeletePaused (2.86s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p pause-328041 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p pause-328041 --alsologtostderr -v=5: (2.857939811s)
--- PASS: TestPause/serial/DeletePaused (2.86s)

TestPause/serial/VerifyDeletedResources (0.41s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
pause_test.go:168: (dbg) Run:  docker ps -a
pause_test.go:173: (dbg) Run:  docker volume inspect pause-328041
pause_test.go:173: (dbg) Non-zero exit: docker volume inspect pause-328041: exit status 1 (23.607073ms)

-- stdout --
	[]
-- /stdout --
** stderr **
	Error response from daemon: get pause-328041: no such volume
** /stderr **
pause_test.go:178: (dbg) Run:  docker network ls
--- PASS: TestPause/serial/VerifyDeletedResources (0.41s)
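
VerifyDeletedResources confirms that the delete left nothing behind: the profile list, the container list, the named volume, and the network list must all come back clean. The same checks run by hand, using only the commands shown above (the volume inspect is expected to fail, as it did here):

    out/minikube-linux-arm64 profile list --output json   # profile no longer listed
    docker ps -a                                          # no pause-328041 container
    docker volume inspect pause-328041                    # "no such volume" is the pass condition
    docker network ls                                     # no pause-328041 network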

TestNetworkPlugins/group/auto/Start (53.26s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p auto-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=docker  --container-runtime=containerd
E1217 21:26:28.507103  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p auto-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=docker  --container-runtime=containerd: (53.2617729s)
--- PASS: TestNetworkPlugins/group/auto/Start (53.26s)

TestNetworkPlugins/group/auto/KubeletFlags (0.33s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p auto-675779 "pgrep -a kubelet"
I1217 21:26:45.646212  369461 config.go:182] Loaded profile config "auto-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.33s)
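
The KubeletFlags subtests assert on the kubelet command line inside the node, fetched with "pgrep -a kubelet" over minikube ssh, as above. A sketch for isolating a single flag from that output; the grep stage and the specific flag are illustrative additions, not part of the test (containerd-backed nodes typically carry a --container-runtime-endpoint flag, but that is an assumption here):

    out/minikube-linux-arm64 ssh -p auto-675779 "pgrep -a kubelet" \
      | grep -o -- '--container-runtime-endpoint=[^ ]*'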

TestNetworkPlugins/group/auto/NetCatPod (10.28s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-299fv" [d671b099-5c35-4591-bc6d-4d096f9d3157] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-299fv" [d671b099-5c35-4591-bc6d-4d096f9d3157] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 10.002803567s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (10.28s)
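
The NetCatPod helpers poll until pods labeled app=netcat report Running and then Ready, as the two status lines above show. A roughly equivalent hand-run check with stock kubectl (the harness polls in Go; kubectl wait is only the ad-hoc analogue):

    kubectl --context auto-675779 -n default wait --for=condition=Ready pod -l app=netcat --timeout=15m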

TestNetworkPlugins/group/auto/DNS (0.2s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.20s)

TestNetworkPlugins/group/auto/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.15s)

TestNetworkPlugins/group/auto/HairPin (0.15s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.15s)
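
The HairPin probe above has the netcat pod dial its own service name, which verifies hairpin NAT: traffic leaves the pod toward the service VIP and must be routed back to the very same pod. The Localhost probe before it is the control, targeting 127.0.0.1 inside the pod, so a HairPin failure with a Localhost pass points at the CNI/proxy path rather than the pod itself:

    # the probe, verbatim from the log: connect back to ourselves through the "netcat" service
    kubectl --context auto-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"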

TestNetworkPlugins/group/kindnet/Start (46.74s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p kindnet-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=docker  --container-runtime=containerd
E1217 21:27:51.578194  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-032730/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p kindnet-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=docker  --container-runtime=containerd: (46.74194347s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (46.74s)

TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:353: "kindnet-zkvs2" [101e1674-2066-463f-8366-a12e8b74b2b5] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.003686284s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)
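
ControllerPod gates the traffic tests on the CNI's own daemon pod becoming healthy, selected purely by label. The equivalent ad-hoc query, with the label and namespace taken from the log line above:

    kubectl --context kindnet-675779 get pods -n kube-system -l app=kindnet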

TestNetworkPlugins/group/kindnet/KubeletFlags (0.31s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p kindnet-675779 "pgrep -a kubelet"
I1217 21:28:10.190083  369461 config.go:182] Loaded profile config "kindnet-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.31s)

TestNetworkPlugins/group/kindnet/NetCatPod (10.25s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-pvwxf" [534928d1-31b3-4056-aaff-828c1f2ad43a] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-pvwxf" [534928d1-31b3-4056-aaff-828c1f2ad43a] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 10.003163597s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (10.25s)

TestNetworkPlugins/group/kindnet/DNS (0.19s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.19s)

TestNetworkPlugins/group/kindnet/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.15s)

TestNetworkPlugins/group/kindnet/HairPin (0.14s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.14s)

TestNetworkPlugins/group/calico/Start (80.33s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p calico-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=docker  --container-runtime=containerd
E1217 21:29:36.456441  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/functional-682596/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:29:49.014713  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p calico-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=docker  --container-runtime=containerd: (1m20.334441362s)
--- PASS: TestNetworkPlugins/group/calico/Start (80.33s)

TestNetworkPlugins/group/calico/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:353: "calico-node-7tw9x" [26ecdf82-5f5b-423e-ae5b-1fd661cbe7c2] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.003789797s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.01s)

TestNetworkPlugins/group/calico/KubeletFlags (0.32s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p calico-675779 "pgrep -a kubelet"
I1217 21:30:08.360838  369461 config.go:182] Loaded profile config "calico-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.32s)

TestNetworkPlugins/group/calico/NetCatPod (8.26s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-ckt84" [7e43476a-17a6-441d-96d3-23d3ff2f3a94] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-ckt84" [7e43476a-17a6-441d-96d3-23d3ff2f3a94] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 8.005006864s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (8.26s)

TestNetworkPlugins/group/calico/DNS (0.19s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.19s)

TestNetworkPlugins/group/calico/Localhost (0.14s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.14s)

TestNetworkPlugins/group/calico/HairPin (0.18s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.18s)

TestNetworkPlugins/group/custom-flannel/Start (57.84s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-flannel-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=docker  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-flannel-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=docker  --container-runtime=containerd: (57.842991879s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (57.84s)

TestNetworkPlugins/group/enable-default-cni/Start (82.33s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p enable-default-cni-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=docker  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p enable-default-cni-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=docker  --container-runtime=containerd: (1m22.332064983s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (82.33s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.31s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p custom-flannel-675779 "pgrep -a kubelet"
I1217 21:31:37.212486  369461 config.go:182] Loaded profile config "custom-flannel-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.31s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (9.26s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-78wbd" [1213ca60-6b86-4ec4-ade3-42e5593d7245] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-78wbd" [1213ca60-6b86-4ec4-ade3-42e5593d7245] Running
E1217 21:31:45.902680  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:45.909105  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:45.920514  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:45.942623  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:45.984046  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:46.065700  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1217 21:31:46.227155  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 9.004719984s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (9.26s)

TestNetworkPlugins/group/custom-flannel/DNS (0.23s)

=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-675779 exec deployment/netcat -- nslookup kubernetes.default
E1217 21:31:46.548916  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.23s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.19s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.19s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.2s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.20s)

TestNetworkPlugins/group/flannel/Start (57.64s)

=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p flannel-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=docker  --container-runtime=containerd
E1217 21:32:26.877870  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/auto-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p flannel-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=docker  --container-runtime=containerd: (57.64056766s)
--- PASS: TestNetworkPlugins/group/flannel/Start (57.64s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.45s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p enable-default-cni-675779 "pgrep -a kubelet"
I1217 21:32:51.434387  369461 config.go:182] Loaded profile config "enable-default-cni-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.45s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (9.38s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-hdmsr" [f80e8fee-2958-4080-8734-bfe89a75209b] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-hdmsr" [f80e8fee-2958-4080-8734-bfe89a75209b] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 9.004144785s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (9.38s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.18s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.18s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.15s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.15s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.15s)

TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:353: "kube-flannel-ds-28dzf" [7dd2d005-0ec5-4a9d-b5d9-d02458762b23] Running
E1217 21:33:14.126234  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kindnet-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.003984435s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.4s)

=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p flannel-675779 "pgrep -a kubelet"
I1217 21:33:17.952751  369461 config.go:182] Loaded profile config "flannel-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.40s)

TestNetworkPlugins/group/flannel/NetCatPod (10.36s)

=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-b4f4z" [b0873586-daa4-44c0-bbdc-5fbc8f1aac9c] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-b4f4z" [b0873586-daa4-44c0-bbdc-5fbc8f1aac9c] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 10.004583294s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (10.36s)

TestNetworkPlugins/group/bridge/Start (77.02s)

=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p bridge-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=docker  --container-runtime=containerd
E1217 21:33:24.367766  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/kindnet-675779/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p bridge-675779 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=docker  --container-runtime=containerd: (1m17.019059179s)
--- PASS: TestNetworkPlugins/group/bridge/Start (77.02s)

TestNetworkPlugins/group/flannel/DNS (0.36s)

=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.36s)

TestNetworkPlugins/group/flannel/Localhost (0.31s)

=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.31s)

TestNetworkPlugins/group/flannel/HairPin (0.18s)

=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.18s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.38s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-arm64 ssh -p bridge-675779 "pgrep -a kubelet"
I1217 21:34:40.238239  369461 config.go:182] Loaded profile config "bridge-675779": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.38s)

TestNetworkPlugins/group/bridge/NetCatPod (11.38s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-675779 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:353: "netcat-cd4db9dbf-fj48g" [19bcd946-faf5-4160-92f0-c037d4d6e955] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:353: "netcat-cd4db9dbf-fj48g" [19bcd946-faf5-4160-92f0-c037d4d6e955] Running
E1217 21:34:49.015088  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 11.005002367s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (11.38s)

TestNetworkPlugins/group/bridge/DNS (0.18s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-675779 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.18s)

TestNetworkPlugins/group/bridge/Localhost (0.2s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.20s)

TestNetworkPlugins/group/bridge/HairPin (0.16s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-675779 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.16s)

Test skip (37/369)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.3/cached-images 0
15 TestDownloadOnly/v1.34.3/binaries 0
16 TestDownloadOnly/v1.34.3/kubectl 0
23 TestDownloadOnly/v1.35.0-rc.1/cached-images 0
24 TestDownloadOnly/v1.35.0-rc.1/binaries 0
25 TestDownloadOnly/v1.35.0-rc.1/kubectl 0
29 TestDownloadOnlyKic 0.43
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
130 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
132 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
207 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv 0
212 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv 0
248 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig 0
249 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
250 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS 0
261 TestGvisorAddon 0
283 TestImageBuild 0
284 TestISOImage 0
348 TestChangeNoneUser 0
351 TestScheduledStopWindows 0
353 TestSkaffold 0
373 TestNetworkPlugins/group/kubenet 3.49
381 TestNetworkPlugins/group/cilium 5.4

TestDownloadOnly/v1.28.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

TestDownloadOnly/v1.28.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

TestDownloadOnly/v1.28.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

TestDownloadOnly/v1.34.3/cached-images (0s)

=== RUN   TestDownloadOnly/v1.34.3/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.3/cached-images (0.00s)

TestDownloadOnly/v1.34.3/binaries (0s)

=== RUN   TestDownloadOnly/v1.34.3/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.3/binaries (0.00s)

TestDownloadOnly/v1.34.3/kubectl (0s)

=== RUN   TestDownloadOnly/v1.34.3/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.3/kubectl (0.00s)

TestDownloadOnly/v1.35.0-rc.1/cached-images (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/cached-images (0.00s)

TestDownloadOnly/v1.35.0-rc.1/binaries (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/binaries (0.00s)

TestDownloadOnly/v1.35.0-rc.1/kubectl (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/kubectl (0.00s)

TestDownloadOnlyKic (0.43s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-602911 --alsologtostderr --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:176: Cleaning up "download-docker-602911" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-602911
--- SKIP: TestDownloadOnlyKic (0.43s)

TestOffline (0s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0s)

=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:761: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:485: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)

=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin

=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1035: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)

=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)

=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)

=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)

=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv

=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestISOImage (0s)

=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

TestChangeNoneUser (0s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)

TestNetworkPlugins/group/kubenet (3.49s)

=== RUN   TestNetworkPlugins/group/kubenet
net_test.go:93: Skipping the test as containerd container runtimes requires CNI
E1217 21:14:49.014793  369461 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/addons-060437/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
panic.go:615: 
----------------------- debugLogs start: kubenet-675779 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: kubenet-675779

>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: kubenet-675779

>>> host: /etc/nsswitch.conf:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/hosts:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/resolv.conf:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: kubenet-675779

>>> host: crictl pods:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: crictl containers:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> k8s: describe netcat deployment:
error: context "kubenet-675779" does not exist

>>> k8s: describe netcat pod(s):
error: context "kubenet-675779" does not exist

>>> k8s: netcat logs:
error: context "kubenet-675779" does not exist

>>> k8s: describe coredns deployment:
error: context "kubenet-675779" does not exist

>>> k8s: describe coredns pods:
error: context "kubenet-675779" does not exist

>>> k8s: coredns logs:
error: context "kubenet-675779" does not exist

>>> k8s: describe api server pod(s):
error: context "kubenet-675779" does not exist

>>> k8s: api server logs:
error: context "kubenet-675779" does not exist

>>> host: /etc/cni:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: ip a s:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: ip r s:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: iptables-save:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: iptables table nat:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> k8s: describe kube-proxy daemon set:
error: context "kubenet-675779" does not exist

>>> k8s: describe kube-proxy pod(s):
error: context "kubenet-675779" does not exist

>>> k8s: kube-proxy logs:
error: context "kubenet-675779" does not exist

>>> host: kubelet daemon status:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: kubelet daemon config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> k8s: kubelet logs:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/kubernetes/kubelet.conf:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /var/lib/kubelet/config.yaml:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: cluster_info
    server: https://192.168.85.2:8443
  name: missing-upgrade-335298
contexts:
- context:
    cluster: missing-upgrade-335298
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: context_info
    namespace: default
    user: missing-upgrade-335298
  name: missing-upgrade-335298
current-context: missing-upgrade-335298
kind: Config
preferences: {}
users:
- name: missing-upgrade-335298
  user:
    client-certificate: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.crt
    client-key: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.key
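
The kubeconfig above is why every kubectl call in this dump fails: only the missing-upgrade-335298 context exists, so anything pinned to kubenet-675779 errors out. A quick way to confirm, as a hedged aside rather than part of the test run (assumes the same KUBECONFIG is active):

kubectl config get-contexts -o name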

>>> k8s: cms:
Error in configuration: context was not found for specified context: kubenet-675779

>>> host: docker daemon status:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: docker daemon config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/docker/daemon.json:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: docker system info:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: cri-docker daemon status:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: cri-docker daemon config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: cri-dockerd version:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: containerd daemon status:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: containerd daemon config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /lib/systemd/system/containerd.service:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/containerd/config.toml:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: containerd config dump:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: crio daemon status:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: crio daemon config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: /etc/crio:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

>>> host: crio config:
* Profile "kubenet-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-675779"

----------------------- debugLogs end: kubenet-675779 [took: 3.34294615s] --------------------------------
helpers_test.go:176: Cleaning up "kubenet-675779" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p kubenet-675779
--- SKIP: TestNetworkPlugins/group/kubenet (3.49s)

TestNetworkPlugins/group/cilium (5.4s)

=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:615: 
----------------------- debugLogs start: cilium-675779 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-675779

>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-675779

>>> host: /etc/nsswitch.conf:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/hosts:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/resolv.conf:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-675779

>>> host: crictl pods:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: crictl containers:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> k8s: describe netcat deployment:
error: context "cilium-675779" does not exist

>>> k8s: describe netcat pod(s):
error: context "cilium-675779" does not exist

>>> k8s: netcat logs:
error: context "cilium-675779" does not exist

>>> k8s: describe coredns deployment:
error: context "cilium-675779" does not exist

>>> k8s: describe coredns pods:
error: context "cilium-675779" does not exist

>>> k8s: coredns logs:
error: context "cilium-675779" does not exist

>>> k8s: describe api server pod(s):
error: context "cilium-675779" does not exist

>>> k8s: api server logs:
error: context "cilium-675779" does not exist

>>> host: /etc/cni:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: ip a s:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: ip r s:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: iptables-save:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: iptables table nat:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-675779

>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-675779

>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-675779" does not exist

>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-675779" does not exist

>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-675779

>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-675779

>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-675779" does not exist

>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-675779" does not exist

>>> k8s: describe kube-proxy daemon set:
error: context "cilium-675779" does not exist

>>> k8s: describe kube-proxy pod(s):
error: context "cilium-675779" does not exist

>>> k8s: kube-proxy logs:
error: context "cilium-675779" does not exist

>>> host: kubelet daemon status:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: kubelet daemon config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> k8s: kubelet logs:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/21808-367595/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: cluster_info
    server: https://192.168.85.2:8443
  name: missing-upgrade-335298
contexts:
- context:
    cluster: missing-upgrade-335298
    extensions:
    - extension:
        last-update: Wed, 17 Dec 2025 21:14:32 UTC
        provider: minikube.sigs.k8s.io
        version: v1.35.0
      name: context_info
    namespace: default
    user: missing-upgrade-335298
  name: missing-upgrade-335298
current-context: missing-upgrade-335298
kind: Config
preferences: {}
users:
- name: missing-upgrade-335298
  user:
    client-certificate: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.crt
    client-key: /home/jenkins/minikube-integration/21808-367595/.minikube/profiles/missing-upgrade-335298/client.key

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-675779

>>> host: docker daemon status:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: docker daemon config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/docker/daemon.json:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: docker system info:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: cri-docker daemon status:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: cri-docker daemon config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: cri-dockerd version:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: containerd daemon status:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: containerd daemon config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/containerd/config.toml:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: containerd config dump:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: crio daemon status:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: crio daemon config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: /etc/crio:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

>>> host: crio config:
* Profile "cilium-675779" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-675779"

----------------------- debugLogs end: cilium-675779 [took: 5.243296412s] --------------------------------
helpers_test.go:176: Cleaning up "cilium-675779" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cilium-675779
--- SKIP: TestNetworkPlugins/group/cilium (5.40s)
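Note: every query in the debugLogs dump above fails for one of two reasons. The cilium-675779 profile was never started (the test was skipped and the profile cleaned up), so no kubeconfig context with that name exists, and minikube has no profile to inspect; the only surviving context points at missing-upgrade-335298, as the kubectl config dump shows. A minimal sketch of reproducing both failure classes by hand, assuming the same kubeconfig; the commands are standard kubectl/minikube CLI, and the expected output is illustrative, not captured from this run:

    # Query a context absent from the kubeconfig -- the
    # 'error: context "..." does not exist' class of failure above:
    kubectl --context cilium-675779 get pods -A

    # List the contexts that actually exist; only the surviving profile shows up:
    kubectl config get-contexts -o name

    # Ask minikube about the deleted profile -- the
    # '* Profile "cilium-675779" not found' class of failure above:
    minikube status -p cilium-675779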